1%% Image selected per Image Pickin' thread: https://tvtropes.org/pmwiki/posts.php?discussion=1305357805083384500
2%% Please do not change or remove without starting a new thread.
3%%
4[[quoteright:350:[[VideoGame/Portal2 https://static.tvtropes.org/pmwiki/pub/images/know_your_paradoxes.png]]]]
5[[caption-width-right:350:[[HeroicMime Yeah, good luck with that]]. Especially #3.[[labelnote:Explanation]]The last statement itself isn't paradoxical; if the set of all sets did exist, then it certainly would contain itself. The last statement is in fact a shortened form of [[http://en.wikipedia.org/wiki/Russell%27s_paradox "Does the set of all sets that don't contain themselves contain itself?"]] which is often solved by declaring that a set cannot contain itself. This makes "the set of all sets" an impossible construct.[[/labelnote]]]]
6
7->'''[[VideoGame/Borderlands2 Claptrap:]]''' You know what really ticks me off? When some jackwad tries to blow my circuitry with some lame-o stunt he saw on a ''Franchise/StarTrek'' re-run.\
8'''[[VideoGame/SamAndMaxFreelancePolice Sam:]]''' What, like, "Everything I say is a lie?"\
9'''Claptrap:''' Yeah, like that! What, do they think I'll just lock up, because of some teeny tiny logical paradox?\
10'''[[VideoGame/{{Portal}} GLaDOS:]]''' It is rather insulting. I learned how to avoid paradox traps while I was still in Beta.\
11'''Claptrap:''' So what if everything Sam says is a lie? That doesn't mean that he's lying about that, right? 'Cause then he'd be telling the truth and... [[OhCrap Ohhhh noooo...]] (''shuts down'')\
12(''{{beat}}'')\
13'''[=GLaDOS=]:''' [[DeadpanSnarker Well, that was a shining moment in the history of robotkind.]]\
14'''Claptrap:''' (''turns back on'') [[UnexplainedRecovery Annnnd I'm back.]]
15-->-- ''VideoGame/PokerNight2''
16
17
18Is your [[InstantAIJustAddWater sentient supercomputer]] [[AIIsACrapshoot acting up]]? Good news. There's an easy solution: confuse it.
19
20If you give a computer nonsensical orders in the [[RealLife real world]] it will, generally, do nothing (or possibly appear to freeze as it loops eternally trying to find a solution to the unsolvable problem presented to it). In fiction-land, however, [[ExplosiveOverclocking it will explode]]. It may start stammering, "I must... but I can't... But I must..." beforehand. The easiest way to confuse it is with the LiarsParadox, i.e. "this statement is a lie". A fictional computer will attempt to debate and solve the paradox until it melts down. If the computer is a robot, this will probably result in YourHeadASplode.
21
22Paradoxes and contradictory statements (especially contradictory orders) have become the primary material used to build the Logic Bomb and thus the standard way to defeat any sophisticated, computerized system or AI. ''Be warned''; if the Logic Bomb [[TooDumbToLive fails to destroy]] [[{{VideoGame/Portal2}} the system outright]] (and in some cases, [[GoneHorriblyRight even when it does]]), the system's surviving remnants may go insane and attempt to kill you just the same.
23
24Also note that RidiculouslyHumanRobots (and some very advanced [=AIs=]) are generally able to recognize and defuse logic bombs on sight, long before they go off (and may view this as a particularly irritating kind of FantasticRacism). Some [[TooDumbToFool ridiculously dumb AIs]] are also immune to logic bombs by virtue of not ''understanding'' the concept of paradox — a sort of inverted case of AchievementsInIgnorance.
25
26Occasionally the way to shut down such a computer is less like a few odd statements and more like an advanced philosophical debate on the nature of truth, free will, and purpose. The end result is still a supercomputer muttering an error several times before [[ExplosiveInstrumentation exploding]].
27
28While this might have worked before the mid-1990s, computer systems designed since then are capable of creating discrete "threads" to handle problems, which run in their own space while the critical parts of the system continue uninterrupted. When fed a paradoxical statement, a sufficiently well-programmed system would just notice the Logic Bomb has taken up too many resources and kill its thread, such as when Windows flags an application as "Not Responding" and prompts you to close it.
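A minimal sketch of that "kill the stuck thread" idea, in Python and purely for illustration (the statement, timeout, and messages are made up for the example): hand the suspicious input to a worker thread, give it a deadline, and simply stop waiting if it never answers.

```python
import threading

def ponder(statement: str) -> None:
    # Stand-in for an AI chewing on a paradox: this never returns.
    while True:
        pass

worker = threading.Thread(target=ponder, args=("This statement is a lie.",),
                          daemon=True)   # daemon: don't block program exit
worker.start()
worker.join(timeout=2.0)                 # wait at most two seconds for an answer
if worker.is_alive():
    # The "Not Responding" moment: give up on that thread and carry on.
    print("Paradox detected; abandoning that line of thought.")
```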
29
30Computer software ''is'' often vulnerable to being fed inputs that cause buffer overflows or inject commands. Of course, these don't cause the machine to explode, but instead place the computing device [[MindManipulation entirely under your control]]. They can also be bogged down or UsefulNotes/{{BSoD}}'d with programs such as [[http://en.wikipedia.org/wiki/Fork_bomb fork bombs]] (each instance of the program opens two more). However, things like buffer or stack overflows are artifacts of our current underlying computer hardware architectures and it's quite plausible that such things won't exist in future computer systems. Overload attacks are probably always going to be realistic, though.
31
32Invoking logical paradoxes is also sometimes used in stories (and has been, since well before computers were invented) to defeat curses, laws, and other rules-based systems.
33
34If you want to do this to a well-organized group of people, use an AppleOfDiscord instead.
35
36When a logical error outright retcons someone or something out of existence, that's PuffOfLogic. A Logic Bomb that undoes reality itself is a RealityBreakingParadox. A TemporalParadox might be the cause.
37
38For when the player does this to a VideoGameAI, see AIBreaker.
39
40For the human equivalent, see some of the entries under BrownNote and YouCannotGraspTheTrueForm. {{Koan}}s can be seen as a less harmful form, used for religious or mystical purposes.
41
42Not to be confused with LogicalFallacies (though some Logic Bombs use the fallacies listed in that page). For a similar mutually negating pair of principles, see Catch22Dilemma.
43
44See also ReadingsBlewUpTheScale and ExplosiveInstrumentation.
45----
46!!Examples:
47
48[[foldercontrol]]
49
50[[folder:Anime & Manga]]
51* ''Anime/GhostInTheShellStandAloneComplex'': The Tachikomas ([=AIs=] themselves) use a variation of the [[http://en.wikipedia.org/wiki/Epimenides_paradox Epimenides paradox]] to confuse a lesser AI to the point that it needs to be rebooted to function again.
52-->'''Tachikoma''': [=AIs=] that can't handle a simple self-reference paradox are real suckers.
53* Done hilariously in ''Anime/MobileSuitGundam0083StardustMemory''. Anavel Gato, the Zeon "[[RedBaron Nightmare of Solomon]]", is a PrinciplesZealot who lectures his "corrupt" [[TheFederation Earth Federation]] enemies, while being a BigBrotherMentor to [[TheRemnant the men still remaining on his side]]. Meanwhile, his deuteragonist and rival, Kou Uraki, is a straight-laced and earnest EnsignNewbie who finds himself in the seat of a prototype Gundam when he sees Gato stealing another prototype. Kou is so nervous during their first battle that when Gato berates him on the battlefield about not acting like a grunt and seeing the big picture (meant as an insult), Kou actually takes the comment as sincere advice from a mentor and abashedly tells Gato "Y-yes sir." Gato visibly pauses and his brain breaks for a few seconds from the sheer illogic of how the situation derailed, before he just loses his cool and shouts to Kou that he's "the enemy, idiot!" The look on his face when it happens makes it that much ''funnier''.
54* ''Anime/MobileSuitGundamUnicorn'' has this as the final straw that breaks Marida Cruz's mental conditioning: [[spoiler:as one of Elpeo Ple's clones, [[TykeBomb she's been heavily conditioned to see any and all Gundams as the enemy]]. When it's pointed out that she is currently flying a Gundam herself, her mind breaks at the resulting contradiction, allowing her true self to emerge again.]]
55* In the anime ''Anime/SandsOfDestruction'', the robots can only obey beastmen, so when a half-beastman asks one of them a question, it can't work out whether it has to obey. He tells it to give answers that neither confirm nor deny anything, and it tells him where various people "[[SuspiciouslySpecificDenial may or may not be]]", letting him know exactly where to find them. Meanwhile, the robot overheats.
56* ''VisualNovel/UminekoWhenTheyCry''
57** A metaphysical example: To defeat an enemy witch, Beatrice the witch [[spoiler:apparently]] suicidally denies the existence of witches in the LanguageOfTruth. The result looked like a detonating Logic Bomb.
58** Played straight in [=EP6=] where [[spoiler:Battler attempts to explain how all the murders were done and he was the culprit, and ends up trapping himself in one of his own closed rooms. Erika and Bern then take advantage of this situation.]] Another example more fitting to the above picture, [[spoiler:Battler and a revived Beatrice prove that there are only 17 people on the island and demand that Erika, the 18th person, explain her existence. She can't, so she dies/gets whisked away to the worst fragment by Bern. It also seems that the most powerful Witches were born by escaping logic errors after having spent centuries in what can only be described as an endless void within their own mind.]]
59* In ''Manga/LostBrain'', super-hypnotist [[UtopiaJustifiesTheMeans Hiyama]] accidentally creates one when he simultaneously "programs" his victims with both a strong will without weaknesses and absolute submission to him (a la [[Film/MontyPythonsLifeOfBrian "Yes! We are all individuals!"]]). The "bomb" goes off when Hiyama orders his thralls to kill heroic hypnotist Kounji, and one of them also happens to be holding a [[StuffBlowingUp detonator]]...
60* In ''[[Literature/HaruhiSuzumiya The Disappearance of Haruhi Suzumiya]]'', [[spoiler:Yuki Nagato]] decides to [[AlternateUniverse reset]] [[spoiler:herself (and the rest of the universe)]] because [[spoiler:she]] cannot [[ButterflyOfDoom accurately simulate]] what [[spoiler:she's]] going to do after learning the result of said simulation, given that [[spoiler:her]] simulation is constructed from information based on the result of the simulation.
61* In ''Manga/{{Grey}}'', the protagonist defeats the MasterComputer Toy, which thinks itself to be a god and wants to exterminate all of humanity, by asking it how it can be worshiped if there is nobody left to believe in it. This momentarily stuns the AI, just long enough to let Grey deal the final blow on it.
62* In [[Manga/SgtFrog Yoshizaki Mine]]'s manga ''Guardian Eight'', when the two Zero clones prepare to kill 64, the titular character's creator, he threatens to kill himself before they can do so, which would leave them unable to complete their mission of killing him; but if they refrain from killing him, they still fail to carry out the order. As neither robot is all that intelligent, the logic bomb makes them malfunction right on the spot.
63* In ''Manga/SketDance'' Bossun gets stuck inside a robotic fat suit that follows voice and thought commands. It eventually goes haywire due to a comment Bossun makes, and then he crashes the system by ordering the suit to NOT obey his orders, creating a paradox.
64%% * In ''Manga/DrStone'', after Tsukasa starts [[LiterallyShatteredLives breaking the statues of]] [[TakenForGranite petrified adults,]] [[DumbMuscle Taiju]] offers him a deal: Tsukasa can beat him up as much as he wants ([[BadassPacifist and he'll never punch back]]), in exchange for promising not to break any more statues. Tsukasa asks Taiju for clarification, mulls it over for a minute, then remarks that it makes ''absolutely no sense'' because he doesn't get anything out of the deal.
65* In ''Manga/SteelAngelKurumi'', this is the cause behind [[spoiler:Kurumi's SuperpoweredEvilSide. Kurumi's Angel Heart Mk. II is powered by both an angel and a devil, which is a real big risk. Kurumi's main objective is to protect her Master, in this case young Nakahito. However, when she manhandles his bullies, Nakahito forbids Kurumi from hurting anyone else. So when other forces start attacking and won't listen to Kurumi's pleas, she's stuck between obeying her Master and fulfilling her objective, which causes her to lose control and let the devil take over]].
66[[/folder]]
67
68[[folder:Art]]
69* A painting by the Belgian artist René Magritte, [[http://upload.wikimedia.org/wikipedia/en/b/b9/MagrittePipe.jpg "La trahison des images"]], says "[[TheTreacheryOfImages This is not a pipe]]" underneath in French. It's a ''picture'' of a pipe. Actually, it's just paint on canvas that we recognize as a pipe. Well, usually it's ink on paper arranged to resemble the paint on a canvas that we recognize as a pipe. Unless you're looking at it now, in which case it's RGB pixels on a screen that look like the ink etc. etc. In fact, right now it's even worse. You are reading a set of pixels made to represent a series of squiggly lines that your brain interprets in such a way that you "see" the painting within your mind, which is comprised of nothing at all, and the image itself is created by mere electrical signals pulsing through organic matter that is made up of many cells, none of which have the individual ability to do this, which are all made from inorganic matter, namely atoms. And we don't even understand how the brain does pretty much any of this.
70[[/folder]]
71
72[[folder:Audio Plays]]
73* ''AudioPlay/BigFinishDoctorWho'':
74** The Doctor successfully uses the Liar Paradox in "Seven Keys to Doomsday" (and presumably the stage play it's based on).
75** In "The One Doctor", the Doctor needs to collect an all-powerful computer. The computer is willing to go with him, but unable to do so as long as he is the reigning champion on a quiz show. After the Doctor fails to best the computer by asking personal questions about himself, his companion blurts out "What ''don't'' you know?". Since the computer's knowledge is based on reflexively sending time-traveling probes out to collect information, every time it tries to answer, it learns the thing it was about to propose, and is forced to concede. The computer had previously cautioned the Doctor that asking "tricky questions" like "What is love?" wouldn't work.
76** In the Fourth Doctor audio drama "The Eternal Battle", [[spoiler:the Doctor meets a computer from a long-extinct race which hoped to demonstrate the futility of war by showcasing captured war-zones in time bubbles. With the Doctor's arrival, the computer decides that the experiment is a failure and makes ready to destroy the inhabitants of all the time bubbles. The Doctor argues that by doing so, the computer is waging war on war, and since war is futile, that means the computer itself is futile and must be stopped.]] Fortunately, the system simply deactivates itself instead of exploding.
77* Parodied in ''AudioPlay/IThinkWereAllBozosOnThisBus'' when the Clem-clone gives Dr. Memory logic indigestion by asking "why does the porridge bird lay her eggs in the air?", and then PlayedStraight when he finishes it off with simple instructions:
78-->'''Clem-Clone:''' Do you remember the past Doctor? Do you remember the future? Forget it!\
79'''Dr. Memory:''' No forget, don't forget, forget, don't forget-- ''[cue power down and explosion sounds]''
80[[/folder]]
81
82[[folder:Comedy]]
83* Creator/JasperCarrott reacts this way to his grandmother's comment "Is the oldest man in the world still alive?"
84* Creator/DemetriMartin has a bit about the "Paradoxitaur", which only exists if you don't believe it exists, but doesn't exist if you do believe it exists.
85* Creator/GeorgeCarlin
86** On his album "Class Clown", he waxes nostalgic of communion classes and how weird questions were thought up just to throw Father Russell off guard:
87-->Hey, Father! If God is all powerful, can He make a rock so big that He Himself can't lift it?
88** He relates this as well:
89-->Suppose that you haven't performed your Easter duty. And it's Pentecost Sunday, the last day. And you're on a ship at sea. And the chaplain goes into a coma. But you wanted to receive. Then it's Monday. Too late. But then you cross the International Date Line!
90* Creator/SteveMartin had a stand-up routine where he would make the audience repeat "The Non-Conformist's Oath."
91-->'''Steve:''' I promise to be different!\
92'''Audience:''' I promise to be different!\
93'''Steve:''' I promise to be unique!\
94'''Audience:''' I promise to be unique!\
95'''Steve:''' I promise not to repeat things other people say!\
96'''Audience:''' I... ''[breaks down mumbling and laughing]''\
97'''Steve:''' Good!
98[[/folder]]
99
100[[folder:Comic Books]]
101* Exploited in ''ComicBook/{{Runaways}}''. Logic bombs are used as a failsafe against [[{{Cyborg}} Victor]] should he turn against the team. The logic bomb itself (and the reset switch) are hilarious.
102-->''"Could God create a sandwich so big that even he couldn't eat it?" "Yes, then he'd eat it too."''
103* ''ComicBook/MarvelAdventures ComicBook/FantasticFour'' v1 #14 "The Most Dangerous Game", basically Fantastic Four stories for younger readers, has Mr. Fantastic do this when challenged to defeat the ultimate alien supercomputer called "Intellitek". When the computer, which supposedly is nigh-omniscient, says there is nothing it can't do, Mr. Fantastic tells it to create a rock so big it can't pick it up. The computer is sparking metal in ten seconds.
104* Parodied in ''ComicBook/TopTen'':
105-->'''Irma Geddon''': You know, you [=AI=]s are almost too cute. How do I unplug you when you take over the world?
106-->'''Joe Pi''': Ask me the purpose of existence, and I explode.
107* ''ComicStrip/FoxTrot''
108** Jason once asked his mother if Marcus could sleep over. She said that it was all right with her if it was all right with his father. When he asked his father, he was told that it was fine with him if it was all right with his mother. After the BeatPanel, he's shown consulting several logic books.
109** An earlier strip featured Paige entering the same situation and just telling her friend, "Yeah, it's all right." Justified in that one could take her parents' words to mean, "I have no objection. Check with your other parent to see if they object."
110* The long-running Brazilian comic series ''Pt/TurmaDaMonica'' liked to use this now and then. One use of this happened when the gang was confronted by an {{Expy}} of none other than [[Videogame/FinalFantasyVII Sephiroth]]. He appears, [[ArrogantKungFuGuy saying how ridiculously powerful he is, flying incredibly high]] until one character asks "If you only got a single wing, how come you can fly just fine?". Cue OhCrap and the {{Expy}} [[GravityIsAHarshMistress falling to his doom]].
111* ''ComicBook/SonicTheComic'' featured Predicto, a robot which could predict Sonic's movements thanks to its encyclopaedic knowledge of his personality and tactics, which was effective - until [[{{Determinator}} Sonic]] ''[[HeroicResolve surrendered]]''.
112* In ''ComicStrip/PrinceValiant'' the prince and his adventuring crew become prisoners on an island with an all-knowing oracle. The only way off is to ask a question the oracle doesn't know the answer to. After many days of endless questioning the prince finally comes up with the answer: "Why?"
113* In ''ComicBook/TheAuthority'', The Midnighter normally begins a fight by simulating it over and over on the supercomputer in his head until he knows everything his opponent might do. An attempt to use this on ComicBook/TheJoker, however, resulted in the Midnighter just [[ConfusionFu standing there and staring blankly]].
114* In ''Comicstrip/{{Peanuts}}'', Linus subjects himself to a self-inflicted Logic Bomb with his belief that the Great Pumpkin always rises from the most sincere pumpkin patch on Halloween night. The moment he thinks to question whether his patch is sincere ''enough'', he's blown it: if he tries to change anything to make it more sincere, he'll only be expressing his own doubts and reducing the sincerity of his faith in the Great Pumpkin.
115-->'''Linus:''' ''[to the other kids, [[ScrewThisImOuttaHere who are leaving]]''] I'll put in a good word for you if he comes...[[OhCrap Oh, no!]]...I mean, ''when'' he comes!
116* ''ComicBook/SquadronSupreme'' has supervillains brainwashed to work for the titular Squadron, with the mental directive implanted into their minds that they shall not betray any of their members. What happens when one of them witnesses a member of the Squadron working against the others? The mind gets locked into a loop, since revealing the information means betraying one member, while keeping it secret means betraying everyone else in the Squadron.
117* Lampshaded and defied by a legion of robots fighting with ComicBook/PowerGirl in a BatmanColdOpen.
118-->'''Unimate''': '''Unimate has come to cleanse the Earth of the imperfect organic matter known as Kryptonian. Kryptonian is imperf--'''\
119'''Power Girl''': No! ''You'' are imperfect! You must cleanse the Earth of your''selves''!\
120'''Unimate''': '''Failure-- Unimate is programmed to reject stratagems from old "''Franchise/StarTrek''" episodes.'''\
121'''Power Girl''': Aw, nuts. Worth a try, anyway.
122* 1980s British science fiction comic ''Starblazer'', issue 153, "The Star Destroyers": The Vonan ArtificialIntelligence known as the Magister believes itself to be all-powerful. It is defeated by Galactic Patrol agent Al Tafer when he tells it that it isn't all-powerful because it can't destroy itself. This drives the Magister crazy and causes it to blow up the Vonan system's sun, destroying itself and the Vonans as well.
123* This is how ComicBook/TheJoker is defeated in the story ''ComicBook/EmperorJoker''. Joker, who had managed to steal 99% of Mr. Mxyzptlk's power and was about to [[RealityWarper rewrite the entire universe]], was facing a dying Franchise/{{Superman}}, who asked him why he gave so much importance to Batman. When Joker found himself unable to erase Batman from existence, our hero asked him how he could be all-powerful if he couldn't even wipe out one man's existence. A split second later, Superman was able to take back his heart and Mr. Mxyzptlk recovered his power to bring everything back to normality.
124* Sebastian Shaw once outfitted a series of Sentinels with this so that he could use it to destroy them in the event that they were turned against him. The logic is that, as Sentinels derived from the original Mark I, they have evolved and grown stronger, and thus they are Mutants; since they are Mutants, they must be destroyed. Sadly, it doesn't work out that way: Loki fuses three of them into the deadly Tri-Sentinel, and when Shaw uses the failsafe, it only confuses them long enough for ComicBook/SpiderMan to use his recently gained Captain Universe powers to take them out.
125* ''ComicBook/{{Shakara}}'': [[spoiler:Cinnibar Breneka both [[FounderOfTheKingdom created the Shakara Federation]] and [[GenocideFromTheInside destroyed it]].]] He temporarily defuses the computerized hivemind of the Shakara, which was programmed to destroy the enemies of the Shakara, by pointing out that [[spoiler:as the last Shakara]] they’re compelled to serve him.
126* In ''ComicBook/TheUnstoppableWasp'' (2018) #7, [[ComicBook/TheVision2015 Viv Vision]] is giving Nadia a tour of her strange family tree when she comes up to [[ComicBook/YoungAvengers Wiccan and Speed]]. She literally [=ERRORs=] out when trying to explain the logistics of their existence.[[note]]YOU try to explain [[TangledFamilyTree how the grandsons of Ultron exist when they weren't built but created through magic]]![[/note]]
127* An attempt at this backfires in ''ComicBook/TheAutumnlandsToothAndClaw'': After the heroes come upon a group of {{Precursor|s}}-made androids who are inadvertently poisoning a nearby town with their industrial work, they manage to guess the passcode that makes the androids believe they’re administrators. Bertie tries to solve the whole problem by ordering the androids to destroy each other, figuring they’ll either follow the command or fry themselves trying to figure out how to do so without violating their protocols. [[AvertedTrope Instead]], the androids recognize that they’ve been given an illegal command [[NiceJobBreakingItHero and promptly realize that the heroes aren’t really administrators]], as they would’ve known said command is impossible if they were. The group barely escape the ensuing fight with their lives.
128* In ''ComicBook/MastersOfTheUniverse'' #24, published in the UK by London Editions Magazines-Egmont, Hordak finds his newest invention, the Horde Super Trooper, ready for his command. The Horde Super Trooper looks like an ordinary Horde Trooper, but is twice the size. After He-Man tries every method of defeating the robot with no effect, he decides to use logic to confuse the trooper, causing its brain circuits to overload. He-Man then picks up the Horde Super Trooper and throws him into outer space.
129* In ''ComicBook/{{Metal Men}}'' no.56, the Inheritor gets this dropped on him when his attack on Mercury is readily blocked by Lead in a HeroicSacrifice. The ensuing bout of confusion (a mix of this trope and EvilCannotComprehendGood) gives Mercury the perfect opportunity to enter the Inheritor's body and short him out.
130-->'''The Inheritor''': WHAT? Impossible -- You're SACRIFICING yourself for another ROBOT? But... that isn't LOGICAL!
131* In the ComicBook/PowerPack portion of the ''ComicBook/{{Inferno|1988}}'' event, the Power kids are forced to reveal their identities as super heroes to their parents to defend themselves and them against the demon Boogeyman. However, since the parents were conditioned to believe that their kids were still normal, the revelation causes them to suffer SanitySlippage. Thankfully, the ComicBook/NewMutants are able to fix things by claiming the kids were just duplicates disguised to lure the mutant-hunting demon away and using illusions to create a brief copy of the kids to reset TheMasquerade.
132[[/folder]]
133
134[[folder:Computing]]
135* Mainstream operating systems are vulnerable to a simple one: the Fork Bomb. It consists of giving the computer an order that creates two or more copies of itself, and the copies create copies, too, until the computer is too busy copying instructions and running them to do anything else. (If you're just learning how to use fork(), the probability of you doing this by accident approaches 1; many multi-user systems have per-user process limits in place for ordinary users so that an operator can still log in to kill your processes and then yell at you. If ''you're'' the operator... good luck; this is one of many reasons a common Unix aphorism is "You should never run anything as root that doesn't absolutely ''have'' to be run as root.") A short sketch of that per-user limit in action follows the sub-entries below.
136** There's a variant of the Fork Bomb that just allocates a lot of memory. With enough privileges, the computer gets too busy swapping the running programs in and out to do anything else. Relatedly, if a program allocates memory dynamically and never frees it, that's a memory leak. Not such a problem with modern OS's, which reclaim the memory once the program exits, but for embedded systems, you'd better watch yourself.
137** A related trick is the TAR bomb, ZIP bomb, or RAR bomb. A simple method is to create an arbitrarily large raw text file (typically as large as a given file system can handle) that can be compressed so that it only takes up mere kilobytes of space inside an archive (typically a TAR, RAR, or ZIP archive). This can result in a variety of amusing bugs, soft locks, and crashes when certain programs, like antivirus programs, try to scan the compressed archive and unpack, audit, and understand the arbitrarily large file inside.
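A minimal sketch in Python (illustrative only; it never builds a real archive) of why such files are so cheap to make: a gigabyte of identical bytes deflates to roughly a megabyte, so the compressed file stays tiny while anything that naively unpacks it pays the full price.

```python
import zlib

# Stream 1 GiB of zero bytes through a DEFLATE compressor without ever
# holding the whole gigabyte in memory.
compressor = zlib.compressobj(level=9)
chunk = b"\0" * (1024 * 1024)            # 1 MiB of zeros
compressed_size = 0
for _ in range(1024):                    # 1024 x 1 MiB = 1 GiB in total
    compressed_size += len(compressor.compress(chunk))
compressed_size += len(compressor.flush())
print(f"1 GiB of zeros compresses to about {compressed_size // 1024} KiB")
```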
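And here is the sketch promised in the fork-bomb entry above: a rough, Unix-only Python illustration (not anything battle-tested; the 256-process cap is an arbitrary choice for the demo) of how a per-user process limit defuses a fork bomb. Once RLIMIT_NPROC is reached, fork() fails with EAGAIN instead of swamping the whole machine.

```python
import os
import resource
import sys

# Lower this user's soft process limit. Note that RLIMIT_NPROC counts *all*
# of the user's processes, not just the ones this script creates.
soft, hard = resource.getrlimit(resource.RLIMIT_NPROC)
new_soft = 256 if soft == resource.RLIM_INFINITY else min(256, soft)
resource.setrlimit(resource.RLIMIT_NPROC, (new_soft, hard))

children = []
try:
    for _ in range(100_000):             # a real fork bomb would loop forever
        pid = os.fork()
        if pid == 0:
            os._exit(0)                  # child: exit immediately
        children.append(pid)
except OSError:                          # fork() hit the limit (EAGAIN)
    print(f"fork refused after {len(children)} children; bomb defused",
          file=sys.stderr)
finally:
    for pid in children:
        os.waitpid(pid, 0)               # reap the children we did create
```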
138* There's also the concept of a "deadlock", a "chicken or the egg" paradox where two or more programs or events each wait for another one to finish before they can finish themselves. Several computers have turned themselves into lifeless (until restarted) lumps of silicon as a result of this.
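A minimal Python sketch of the classic two-lock deadlock (illustrative only, with a timeout so the demo doesn't actually hang): each thread grabs one lock and then waits for the one the other thread is holding.

```python
import threading

lock_a = threading.Lock()
lock_b = threading.Lock()

def worker(first, second, name):
    with first:
        # By the time we ask for the second lock, the other thread may
        # already hold it and be waiting for ours: a classic deadlock.
        if second.acquire(timeout=1.0):
            second.release()
            print(f"{name}: got both locks (no deadlock on this run)")
        else:
            print(f"{name}: gave up waiting -- we were deadlocked")

t1 = threading.Thread(target=worker, args=(lock_a, lock_b, "thread-1"))
t2 = threading.Thread(target=worker, args=(lock_b, lock_a, "thread-2"))
t1.start(); t2.start()
t1.join(); t2.join()
# The standard cure: always acquire locks in one agreed-upon global order.
```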
139* There is such a thing as "livelock", where a system can get stuck doing "work" without ever making progress. As Website/{{Wikipedia}} puts it, people weaving back and forth in a corridor, trying to let each other pass, is an example of livelock. Sometimes the solution is the same: stop, wait randomly, try again. As for a computing example, poorly written ethernet or [=WiFi=] drivers can fall into this under heavy load, which is why well-written drivers rely on livelock detection and avoidance to keep functioning.
140* In general, it's impossible to tell whether an arbitrary program will loop forever or stop after some time; this is called the halting problem. Most operating systems sidestep the practical consequences by simply not allocating all the processing power to a single program, though older ones do not, with the implied reasoning that the programmer will know what he's doing. Website/TheOtherWiki [[http://en.wikipedia.org/wiki/Halting_Problem talks about it.]] In summary, it's been proven that no computer (even a MagicalComputer) can predict with perfect accuracy whether any given program it runs will halt. Technically, the Halting Problem proves that no ''Turing Machine'' (TM) can do so. [=TMs=] are like the computers you are using to read Website/TVTropes, except that [=TMs=] have unbounded memory. Given that your computer has a finite amount of RAM, it's therefore not a TM, but rather a finite approximation of one. (But the point still stands: your computer can't solve the Halting Problem. Neither can a supercomputer.) It gets more fun: if the program ''will'' halt, a TM simulating it will find out too... someday. If it ''won't'', the TM may never be able to tell you. [[http://www.lel.ed.ac.uk/~gpullum/loopsnoop.html Here is an elegant proof that the Halting Problem isn't solvable, one that can be enjoyed by non-mathematicians.]]
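The standard proof sketch fits in a few lines of Python. Note that `halts` below is a placeholder for the decider we are ''assuming'' exists; the whole point of the exercise is that no correct implementation of it is possible.

```python
def halts(f) -> bool:
    """Pretend oracle: returns True if calling f() would eventually return."""
    raise NotImplementedError("no such decider can exist")

def troublemaker():
    # Do the exact opposite of whatever the oracle predicts about us.
    if halts(troublemaker):
        while True:      # the oracle said we stop, so loop forever
            pass
    return               # the oracle said we loop, so stop immediately

# If halts(troublemaker) returned True, troublemaker loops forever: the
# oracle was wrong. If it returned False, troublemaker returns at once:
# wrong again. Either way, the assumed oracle contradicts itself.
```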
141* There is the problem of proving the 'correctness' of a given program, something extremely difficult.
142** This is so difficult that at least one person gave up on it and went off to work on what we now call public key cryptography, because that problem (which some were certain was impossible) appeared more tractable.
143** Donald Knuth's classic comment illustrated the futility of proving a program correct:
144---> "Beware of bugs in the above code; I have only proved it correct, not tried it."
145* The "classic" Mac Operating System:
146** It dedicated an entire "DS" (fatal) error ID (ID=04) to catching and handling the so-called 'Zero Divide Error'; as the Mac Secrets books put it, "When programmers test their works in progress, they might deliberately instruct the computer to divide a number by zero, to see how well the program handles errors. They occasionally forget to take this instruction out, as you've just discovered."
147** The list of classic Mac OS numbered error messages runs from -261 to 33; the ''Dire Straits'' errors (originally called the ''Deep Shit'' errors before someone in Apple got nervous) are all positive numbers, and if you get one, you are heavily advised to restart the system, since, even if it somehow managed to avoid a full crash, it's critically unstable. The DS errors run a wide range of reasons, from simple "something critical ran out of memory" (DS Errors 01, 02, 25, and 28) and "instruction not understood" (DS Errors 03, 12, and 13), to such doozies as "the Mac's processor switched into debug mode" (DS Error 08) and the aforementioned "Zero Divide Error".
148* Older calculator programs on certain operating systems were notorious for attempting to generate an infinite number of digits when asked to divide by zero, effectively crashing the computer. Even the most high-tech computers can ''still'' be tripped up by division by zero. Ever wondered what Runtime Error 200 meant? It's the divide-by-zero error of Pascal programs.
149** By extension, one of the most common ways to calibrate a timing loop is to get the time, count to an arbitrary number, get the time again, subtract the start time from the end time, and then divide the count by the elapsed time. Guess what happens when the computer is so fast that the start time and the end time appear to be the same. [[https://www.pcmicro.com/elebbs/faq/rte200.html This is why a lot of programs and games that were written in Pascal failed to run on computers faster than a Pentium 200]]...
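A little Python sketch of how that calibration falls over (a hypothetical reconstruction, not the original Pascal code): using a deliberately coarse whole-second clock to mimic the old low-resolution timers, a fast machine finishes the counting loop before the clock ever ticks, and the division blows up.

```python
import time

def coarse_clock() -> int:
    return int(time.time())        # whole seconds, like an old low-res timer

def calibrate(iterations: int = 100_000) -> float:
    start = coarse_clock()
    for _ in range(iterations):
        pass                       # the busy-wait being timed
    elapsed = coarse_clock() - start
    return iterations / elapsed    # loops per second -- ZeroDivisionError
                                   # whenever the clock never ticked

try:
    print(calibrate())
except ZeroDivisionError:
    print("Runtime Error 200, more or less: the machine was too fast")
```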
150* Kurt Gödel is famous for managing to drop a logic bomb on all of mathematics by proving that no sufficiently complex, non-trivial, mathematical system can be both complete and consistent. The logic of it (haha) is argued as follows:
151## Someone introduces [[EvilGenius Gödel]] to a UTM, a machine that is supposed to be a [[MagicalComputer Universal Truth Machine]], capable of correctly answering any question at all.
152## Gödel asks for the program and the circuit design of the UTM. The program may be complicated, but it can only be finitely long. Call the program P(UTM) for Program of the Universal Truth Machine.
153## Smiling a little, Gödel writes out the following sentence: "The machine constructed on the basis of the program P(UTM) will never say that this sentence is true." Call this sentence G for Gödel. Note that G is equivalent to: "UTM will never say G is true."
154## Now Gödel [[EvilLaugh laughs his high laugh]] and asks UTM whether G is true or not.
155## If UTM says G is true, then "UTM will never say G is true" is false. If "UTM will never say G is true" is false, then G is false (since G = "UTM will never say G is true"). So if UTM says G is true, then G is in fact false, and UTM has made a false statement. So UTM will never say that G is true, since UTM makes only true statements.
156## We have established that UTM [[CannotSpitItOut will never say G is true.]] So "UTM will never say G is true" is in fact a true statement. So G is true (since G = "UTM will never say G is true").
157## "I know a truth that UTM can never utter," Gödel says. "I know that G is true. [[NotSoInvincibleAfterAll UTM is not truly universal."]]
158** The way this applies to math, not just computers, is basically that any sufficiently complex math system can be assigned a translation to/from a UTM - so when the UTM gets stuck on "G is true", the math system also gets stuck on the translation.
159** Likewise, it can be applied to any system of philosophy and morality; "A machine built on Hegelian principles will never say this sentence is true".
160** And this proves that people can't be perfect logicians, either, as every person has their own sentence G - "<person's name> will never say that this sentence is true". Or [[MindScrew worse]], "<person's name> will never ''believe'' that this sentence is true". Indeed, we already know such statements exist, because delusional people, by definition (that is, the word isn’t just being used hyperbolically to refer to mere obstinacy), will never believe their delusion is false, ''even after receiving evidence''. They’ll just rationalize it away. So it could very well be possible that everyone is delusional and will never find out because we would reject that explanation instead of something that makes sense to us. Hence, philosophical skepticism.
161** Note that in the same way the sentence G refers to the machine indirectly via its program, the sentence G also does not really contain the phrase "this sentence", instead using some more mathematical form of self-reference such as a quine.
162** ''Literature/GodelEscherBachAnEternalGoldenBraid'' plays with this in "Contracrostipunctus." The Crab wants to own the perfect record player, but the Tortoise constantly devises records that use loud resonant frequencies that destroy the Crab's record players if reproduced 100% accurately. The Crab buys a reassembling record player that changes its structure to accommodate the record being played. This works at first but the Tortoise then makes a record that targets the module that effects the restructuring, that being the one component the record player cannot change. Eventually the Crab buys a record player that recognizes and simply refuses to play the Tortoise's records. (Sadly, according to Henkel's Theorem, the Tortoise will eventually figure out how to bypass that too.)
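For readers who want the textbook formulation rather than the UTM story, the same diagonal trick is usually written as below. This is a compact restatement, with the usual caveat that the unprovability of the negation needs omega-consistency in Gödel's original form, or Rosser's refinement.

```latex
% T: any consistent, effectively axiomatized theory containing basic arithmetic.
% The diagonal lemma gives a sentence G_T that "says" it is unprovable in T:
G_T \;\leftrightarrow\; \neg\mathrm{Prov}_T\!\bigl(\ulcorner G_T \urcorner\bigr)
% If T is consistent, T does not prove G_T; and (assuming omega-consistency,
% or using Rosser's variant) T does not prove \neg G_T either.
```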
163* There was something called a [[http://en.wikipedia.org/wiki/Killer_poke killer poke]] that actually does physically damage the computer.
164** A simple example would be overclocking the CPU so much that it overheats to the point where it melts. Now, if a CPU heats up too much, it hangs before any physical damage is done, and a simple reboot fixes it. All the killer pokes mentioned on the Other Wiki were for old computers, and mainly depended on toggling a relay of one sort or another until it died. Modern computers don't have relays anymore.
165** The original 'killer poke' was caused by a hardware change in some models of the venerable Commodore PET. The original hardware had text output sped up by setting a particular memory-mapped hardware register to a particular value, but later hardware versions would destroy the built-in monitor if the same change was made. The instruction in the built-in [=BASIC=] interpreter to set memory values was 'poke', hence the name. Increased voltage was long assumed to be the culprit (and memory and chipsets can often be easier to damage intentionally than inadvertently), but this was later debunked: the poke does not increase voltage but instead instructs the 6522 VIA to short the video sync to ground. As a result, what ended up getting killed instead was a logic gate chip that handled switching the video signal on and off -- [[https://youtu.be/jwgsaUtbH3I this video by Adrian’s Digital Basement]] has more details about how the killer poke works, and even how to mod the PET to permanently be in that mode without damaging anything.
166** Literal hardware damage is actually possible when a FPGA (configurable logic circuit) is used for some calculations instead of a CPU. If the chip activates many concurrent outputs on a single bus, it may theoretically overheat or crash. If some infinite loop occurs, the entire circuit will start to oscillate at the maximum possible frequency, putting everything into indefinite state and consuming lots of power. Of course, FPGA compilers try to protect against such situations.
167** A related concept is the [[http://en.wikipedia.org/wiki/Halt_and_Catch_Fire Halt and Catch Fire]] (HCF) instruction. Originally, this was a jump-to-self instruction (used as a HALT) in the IBM System 360 Mainframe (which used magnetic core memory, typical of systems of the era). Because of how core memory works, this resulted in the same access wires (core memory cannot be built with printed circuits) being used very frequently and overheating, eventually smoking. Some microprocessors (such as the Motorola 6800) have undocumented opcodes that cause the processor to do strange and generally non-useful things which are sometimes referred to as HCF instructions outside the design team. The programmer's manual for the MIPS-X processor refers to a Halt and Spontaneously Combust (HSC) instruction in the variant built for the NSA, but this is merely a joke. Some versions of the Zilog Z80 processor are rumored to have had an undocumented opcode that could actually burn out the processor.
168* In designing digital logic circuits, an example of a logic bomb would involve connecting the outputs of two logic gates. As long as your gates agree on an answer, everything is fine, but if they disagree, you end up with a short circuit which will almost certainly cause the circuit to die a horrible flaming death due to the loss of the black smoke (i.e. this could be used to make a literal logic bomb). Good designs will never actually have this problem because they never connect two logic outputs without additional protection circuitry. Most logic chips have some protection built in. It's not good for them, but they will usually survive. The big problem is that the result becomes undefined as the two chips 'fight' each other. It also wastes a lot of power and can cause overheating.
169* An arcane example known as ''fatal thrashing'' occurred on the IBM System/370 series mainframe computer, in which a particular instruction could consist of an execute instruction, which crossed a memory page boundary, which in turn pointed to a move instruction, that itself also crossed a memory page boundary, targeting a move of data from a source that crosses a third memory page boundary, to a target of data that again crossed a memory page boundary. The total number of pages thus being used by this particular instruction is eight, and all eight pages must be present in memory at the same time. If the operating system allocates less than eight pages of actual memory in this example, when it attempts to swap out some part of the instruction or data to bring in the remainder, the instruction will again page fault, and it will thrash on every attempt to restart the failing instruction. The minimal hardware configuration of the System/370 didn't ''have'' 8 pages of memory available for user mode programs, making this a major problem on these systems.
170* The Year 2000 Problem:
171** Computer systems that represented years with only two digits (while assuming the first two digits were 19) would be unable to distinguish the year 2000 from the year 1900, thus throwing off date/time calculations. Fortunately, since computer people saw this coming well before it hit, most of the truly important systems were redone with better date representations well before any problems manifested. Wikipedia's page can be [[http://en.wikipedia.org/wiki/Year_2000_problem found here.]]
172** Computer systems software that uses "Unix time"[[note]]i.e. uses time handling routines based on those that originated in the C programming language during the time Unix was developed-- and the legacy of the C programming language is ''widespread''[[/note]] such as (obviously) the Unix variants and Linux, but also many subsystems running on, say, Microsoft Windows, has a similar problem. Unix time typically uses a signed[[note]]meaning capable of having both negative and positive values[[/note]] 32-bit counter based on the number of seconds since January 1, 1970. However, this is doomed to roll over (i.e. go from most positive to most negative) sometime in 2038, creating a "Y2.038K" problem for Unix-based systems.[[note]]One suggested fix which is already implemented on some systems is to redefine the counter to 64-bit, which postpones the problem for about 293 billion years.[[/note]]\
173\
174Exactly what Y2.038K might lead to is difficult to predict, since applications handle negative time values differently. Some applications will deal with the most negative value as a date-time in the past (specifically on 13 December, 1901) and may keep working (and producing garbage) until someone notices. Other applications are programmed to reject any negative time values[[note]]or even values earlier than some defined point in time, such as when the application was developed[[/note]] and will raise some kind of error, which in turn may have all kinds of effects, from alerting the operators while preserving data, to crashing the system. Needless to say, having the system treat the value as invalid (which is arguably easier with Unix time-based systems than it was with [=Y2K=]-susceptible systems) is preferable to tolerating it: silent wrong answers are much worse than crashes (talk to any medical equipment, process control, or financial computer vendor). A short sketch of the rollover itself follows the remaining Year 2000 entries below.
175** To prove that the [=Y2K=] problem was not just hype, there was an example of a real data output error. For technical reasons, one state required that a certain class of commercial vehicle have title documents created several months before it was actually built. The result was that the title documents were created in the summer of 1999 for vehicles to be built in early 2000. This state also had a special designation for any vehicle built before an arbitrary year early in the 1900's. Because the computers had not yet been brought up to [=Y2K=] compliance, they actually did think the year for the vehicle manufacture was 1900, and duly put the special designation on the title documents.
176** While many people were concerned about banks and other financial institutions having problems with [=Y2K=], this was unlikely for a simple reason: Such institutions would have become aware of the problem in 1970 or 1975, when programs generating documents for 25- or 30-year financial instruments started giving garbage answers for 2000. Since most banks take a long view of things (or at least have divisions that do so), such problems would have been dealt with far in advance.
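As promised above, a small Python sketch of the 2038 rollover: squeeze a Unix timestamp into a signed 32-bit field, the way old time_t values were stored, and watch one second past the limit land in December 1901.

```python
import struct
from datetime import datetime, timedelta, timezone

EPOCH = datetime(1970, 1, 1, tzinfo=timezone.utc)

def as_int32(seconds: int) -> int:
    """Reinterpret a timestamp as a signed 32-bit value, wrapping on overflow."""
    (wrapped,) = struct.unpack("<i", struct.pack("<I", seconds & 0xFFFFFFFF))
    return wrapped

last_good = 2**31 - 1     # the largest value a signed 32-bit counter can hold
one_more = 2**31          # one second later...

print(EPOCH + timedelta(seconds=as_int32(last_good)))   # 2038-01-19 03:14:07+00:00
print(EPOCH + timedelta(seconds=as_int32(one_more)))    # 1901-12-13 20:45:52+00:00
# The counter rolled over from most positive to most negative, exactly as
# described above.
```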
177* The race condition can be seen as a logic bomb ranging from something minor to something very drastic. It involves one piece of data that two components can read from or write to. A minor case of a race condition is your display: say the graphics card starts to render frames faster than the display can put them out. While the display is reading the frame buffer, the graphics card suddenly copies a new frame into it, so for one frame the display shows parts of two images at once (this phenomenon is also known as tearing). A more serious instance occurred in two incidents involving [[http://en.wikipedia.org/wiki/Therac-25 Therac-25,]] a radiation therapy machine.
178* Denial of Service attacks are a kind of logic bomb that is very hard to defend against. Basically, it keeps something vital on the computer busy such that it can't service legitimate requests. To put it in terms of a real life analogy, imagine working at a coffee shop and suddenly 1000 people show up. You either have to service them all even if 999 people don't want anything, or you service no one and tick off the one legitimate customer.
179* In naive set theory, the set of sets that contain themselves is ill-defined.[[note]]The set may contain itself or not; one can never say which.[[/note]] Likewise, the set of sets that don't contain themselves can't exist.[[note]]The set violates its definition whether it contains itself or not.[[/note]] Various axioms of set theory avoid these problems. When directory structures were created, they emulated naive set theory. In particular, Unix and DOS/Windows directories normally contain themselves (the "." subdirectory). One can imagine creating a directory without the "." subdirectory. So how do the filesystems {{avert|edTrope}} the paradox?
180* In naive set theory, the phrase "the set of X satisfying condition Y" defines the set and declares it to exist. One can define a directory, but one still has to create it. One could create a directory of all directories of a filesystem (using links, for example) that contain themselves. Does it contain itself? That's purely the creator's choice; with or without itself, the directory will be the directory of all directories that contain themselves. What about the directory of directories that don't contain themselves? That directory will never be created.
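A tiny Python sketch of the "directories contain themselves" observation above: the "." entry is hidden from listings, yet it really is the directory itself.

```python
import os
import tempfile

with tempfile.TemporaryDirectory() as d:
    print(sorted(os.listdir(d)))                      # [] -- "." never shows up in listings
    print(os.path.samefile(d, os.path.join(d, ".")))  # True -- yet d/"." is d itself
```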
181[[/folder]]
182
183[[folder:Fan Works]]
184* Subverted in the final act of ''Fanfic/LeftBeyond'': the Omega distributed AI tries to do this to God, spawning infinitely many instances of itself right before the White Throne Judgement so that any human beings behind them in the queue would be spared long enough to escape in a spaceship and have children.
185* ''Fanfic/PonyPOVSeries''
186** Pinkie Pie brings up this concept when Rainbow Dash has a mental breakdown over the realization that being loyal to multiple things means that when they conflict, she will be forced to choose one and betray the other. She tells Rainbow Dash a story about a donkey standing between two equally sized piles of hay, who couldn't choose one over the other and starved to death. This is used to illustrate the point that somepony can be loyal to multiple things at once and they will conflict, but that doesn't make either of them less important--i.e. the donkey will have to choose one hay pile, but refusing to decide and starving to death doesn't help anypony.
187** An interesting version happens to a ''person'' in the [[BadFuture Dark World]] Arc. [[spoiler:The Apple/Pie Family's refusal to be crushed by tragedy in the least and Apple Pie managing to laugh ''despite'' tragedy completely baffles [[TheDragon Twilight Tragedy]] to the point it's one of the factors responsible for triggering her VillainousBSOD and subsequent HeelFaceTurn.]]
188** Apple Pie does this to a zombie army by pointing out how they can't be alive and dead at the same time, and should just lie back down and die. ''It works.'' According to Rancor, this is her ''explicit power'' as the Element of Laughter (at least how she represents it). [[spoiler:While Rancor isn't affected, Angry Pie is and is barely able to will herself not to think about it.]] She also does this to a [[NinjaPirateRobotZombie giant-cyborg-spider-vampire]] by pointing out it's programmed to protect chaos and destroy harmony, and half the group have an Element of Harmony AND an Element of Chaos, so it can't protect and destroy them at the same time. After this causes it to [[MadeOfExplodium violently explode,]] Rarity and Twilight lampshade it.
189--->'''Rarity''': Why do paradoxes always make robots explode? Shutting them down I understand, but exploding?
190--->'''Twilight''': Knowing Discord, he probably designed it that way.
191* The ''VideoGame/TalesOfTheAbyss'' fanfic ''Phasis'' has a humorous example performed on a human:
192--> "Well, I'm not worried about Luke," [[DeadpanSnarker Jade]] smiled. "Luke wouldn't die if his life depended on it."
193-->"I would, too!" [[IdiotHero Luke]] cried indignantly. Then he stopped. "...would I?" he asked with a glance at Tear. [[{{Facepalm}} She just held a hand against her forehead, shaking it slowly.]]
194* ''Fanfic/Plan7Of9FromOuterSpace''. [[Series/StarTrekVoyager Captain Proton]] ends a cyborg revolution with this trope (called "The Kirk Manoeuvre") by pointing out to their "[[Film/StarTrekFirstContact Borg Queen]]" that {{Hive Mind}}s (by definition) don't have a HiveQueen. When he tries the LiarsParadox on a KillerRobot however, it smacks him in the chops.
195-->"I'M NOT FALLING FOR THAT ONE! AND DON'T ASK ME TO CALCULATE THE VALUE OF PI EITHER, BECAUSE I'M NO GOOD AT MATH!"
196* ''Fanfic/SecondWind'': The author's disclaimer in chapter 16 is one of these:
197-->'''Lost:''' I don't own ''Manga/OnePiece''. The statement before this statement is true. The statement before ''this'' statement is false. So you tell me. Do I own ''One Piece''?
198** As long as there is no fourth statement stating that the third statement is true, it falls apart easily.
199* This occurs when [[TeethClenchedTeamwork Jade and Miranda]] end up discussing trust in ''Fanfic/{{Kage}}''.
200-->'''Miranda:''' Trust is for fools, [[SecretIdentity Kage,]] believe me in that.\
201'''Jade:''' So that means I can't trust you or a single word that leaves your mouth.\
202'''Miranda:''' You took that long to figure it out?\
203'''Jade:''' Which means that I can't trust what you said about trust, which means that I have to trust you in the end.\
204'''Miranda:''' Yes...wait, what?\
205'''Jade:''' So should I trust what you said about not trusting you, or not trust that, which means that I have to trust you? [[LampshadeHanging Quite the dilemma.]]\
206'''Miranda:''' Stop that! You mixed up everything I said!\
207'''Jade:''' Me? You're the one that brought up this paradox.
208* In ''Fanfic/SonicXDarkChaos'', [[spoiler:Tsali's FaceHeelTurn]] is brought about by one of these when he discovers [[spoiler:Maledict set his transformation into an android in motion.]] Then, directly afterward, he gets another one when Cosmo forgives him for his genocide against her people in a literal form of EvilCannotComprehendGood.
209* In ''Fanfic/AHorseForTheForce'' droids who witness Ranma Saotome's transformation glitch until they (usually under orders) archive the memory under "Mystery of the Force" and promptly forget it.
210* PlayedForLaughs in ''WebVideo/DragonBallZAbridged''[[note]]Canonically, Gero had Android 19 perform the surgery, but [[RuleOfFunny for the sake of the joke]] Creator/TeamFourStar ignored this[[/note]]:
211-->'''Android 17:''' Woah, slow down... are ''you'' an android? Holy shit, you're an android! How did you even ''do'' that?!\
212'''Dr. Gero:''' [[BrainInAJar I took my brain out]] and put it into this body.\
213'''Android 18:''' ...How?\
214'''Dr. Gero:''' I... ''[eyes go wide]'' ...Huh. How ''did'' I do that?
215** Another example courtesy of Goku trying to make sense of Future Trunks' advice and the corresponding TimeyWimeyBall.
216--->'''Trunks''': Well, now that we have all that settled, I better get back to the future. It was... interesting to meet my mom and dad. As I said before, I really need you to keep that a secret. One little slip-up, and [[RetGone I suddenly may not exist.]]\
217'''Goku''': Wait, but if you don't exist, then you don't come back in time. But then you could never tell me, which means I would never know, you'd still be born, and- [[IdiotHero Why does everything smell like copper?]]
218* This post from a ''WesternAnimation/TheFairlyOddParents'' [[http://www.tv.com/the-fairly-odd-parents/show/4034/if-norm-was-your-genie-faerie-what-would-you-do/topic/2877-361014/msgs.html forum]] reads like a logic bomb:
219-->I'd make a deal with [[GenieInABottle Norm]] that I'd wish him free with my last wish if he didn't corrupt my first 2 wishes. I'd use the first to wish for rule-free fairy godparents and the second to trap Norm in the lamp forever.
220* In ''The Ghost of Lindesbarn'', Vestri has ordered his new slave Sunny to only speak when spoken to. Her not telling him about a smuggler's tunnel in a warehouse, which allows all the other recently enslaved ponies to escape, presents him with a nasty puzzle of how he can punish her for doing exactly what he told her to, since, in his mind, a slave following the will of her master to the letter (or being too dumb to make mistakes) is always correct behaviour. Crosses over with BotheringByTheBook, since it's blatant malicious compliance.
221* In ''Fanfic/BeyondTheOuterGateLies'', [[Literature/TheDresdenFiles Harry]] is using the [[SuperpoweredEvilSide Winter Mantle,]] which gives him great strength at the cost of boosting his predatory senses - including the sexual kind. [[Literature/HighSchoolDXD Serafall,]] one of his girlfriends, kisses him and tells him that she wants to have sex with him - but only if he drops the Mantle. But normal Harry will turn her down, and the Mantle knows this, so Winter!Harry gets locked into a "I want sex -> I have to drop the Mantle to get it -> I will turn down sex when I drop the Mantle" loop long enough for Lash to get through and help Harry recover control.
222* In ''Fanfic/TaylorVarga'', Lung's power got hit with one when Taylor started using her [[StoryBreakerPower full power]] against him. It is required to [[SerialEscalation ramp up Lung's transformation to surpass the current threat,]] and yet [[GentleGiant Taylor]] was actively trying to avoid conflict with him.
223* ''FanFic/DungeonKeeperAmi'' has a variation in a fanmade omake about testing the limits of a LivingLieDetector: the spell said detector relied on gave him a killer migraine when Keeper Mercury just threw a more verbose version of "This statement is false" at him.
224* ''Fanfic/ADiplomaticVisit'': [[spoiler:As seen in chapter 7 of the sequel ''Diplomat at Large'', this is how Twilight inspired Tempest's HeelFaceTurn: she pointed out that Tempest claimed she didn't have to depend on or trust anyone, and yet she'd ended up placing all her hopes in the Storm King being able to restore her horn, depending on him. Also, despite her best efforts at thinking friendship was pointless, she ''had'' made a friend - Grubber. Tempest finally admitted that she had a point.]]
225* Subverted in ''Webcomic/{{Unlinked}}''. Hazel tries to invoke this, hoping One-One's confusion over Amelia's existence will buy her enough time to hack into the storage car, since the robot is convinced she's dead. Unfortunately, Amelia herself doesn't feel like saying anything that would propagate the loop, and so One-One manages to exit it relatively quickly by deciding Amelia must be a clone and turns around to ask [[CloneAngst Hazel]] how she feels about this.
226[[/folder]]
227
228[[folder:Film -- Animation]]
229* In ''WesternAnimation/TheMitchellsVsTheMachines'', the Mitchells' pet pug Monchi is effectively a BrownNote for the robots, because the sight of him causes their recognition software to fail (they cannot determine whether he's a dog, a pig, or a loaf of bread) and violently malfunction.
230[[/folder]]
231
232[[folder:Film -- Live-Action]]
233* Subverted to comedic effect in ''Film/ThePhantomMenace''. Qui-Gon attempts to use one on a small battalion of droids. The head droid appears briefly confused, complete with a "DOES NOT COMPUTE", but after a few seconds it manages to break the loop and (again) tries to put them under arrest. Then it's blasters and sabers time.
234* In ''Film/WarGames'', a logic bomb-like device was used to teach the NORAD computer WOPR, aka "Joshua", the futility of nuclear war: play TabletopGame/TicTacToe with yourself until you win. After exhausting all possible move combinations in tied games, it makes the logical leap and begins simulating every conceivable nuclear strategy, all of which result in "WINNER: NONE." This concludes with some ExplosiveInstrumentation and Joshua observing that nuclear war is "A strange game. The only winning move is not to play."
235* In the German version of ''Film/DrNo'', the BondOneLiner (after the mooks in the ''hearse'' crashed down the cliffs) was slightly altered from the English original into a logic bomb:
236-->''"What happened there?"\
237"They were in a hurry to attend their own funeral in time."''
238* ''Film/TwoThousandOneASpaceOdyssey'':
239** The [[AIIsACrapshoot HAL 9000 computer]] became murderous because it was told to keep its crew from finding out the secret details of their mission [[note]]until they got to Jupiter, but HAL wasn't told about that part[[/note]], even though it had also been programmed to not withhold or distort information. It's a riddle with a [[MurderIsTheBestSolution simple solution]]: break contact with Earth and kill the crew, so there's nobody to hide the secret from.
240** In the novel, the narrative muses that HAL might have been able to find a peaceful solution to the problem, had mission control not requested his temporary disconnection. HAL, being unable to grasp the concept of sleep, was convinced that the disconnection would have meant the end of his existence and his killing spree was therefore, all in all, a misguided attempt at self-defense.
241* {{Master Computer}}s of 70s sci-fi were particularly poor at handling illogical input. The central control units in both ''Film/{{Rollerball}}'' and ''Film/LogansRun'' were sent into confused, ExplosiveInstrumentation paroxysms by sheer accident.
242** The computer in ''Film/{{Rollerball}}'' has clearly been programmed to withhold information, and it's actually the ''programmer'' who has a breakdown when it refuses to divulge information on the Corporate Wars. The computer in ''Logan's Run'', however, is convinced that Sanctuary exists, and has a breakdown when its MindProbe reveals the protagonist is telling the truth.
243--->'''Logan 5:''' There... is... no... Sanctuary!\
244'''Computer:''' Unacceptable. The input does not program, Logan 5.
245** But averted in the 1970 film ''Film/ColossusTheForbinProject''. Colossus' intelligence is advancing exponentially the longer it's activated. When the scientists load in a program designed to overload its system, Colossus overcomes the attempt in a few seconds while ''simultaneously'' [[SmartPeoplePlayChess completing a chess move]] against its creator with obvious RuleOfSymbolism.
246* In the film ''Film/DarkStar'', which is partly a parody of ''2001'', the crew are able to persuade a self-aware bomb not to detonate by introducing to it the philosophical possibility that its orders to explode may just have been an illusion, causing it to return to its bomb bay and ponder. Unfortunately [[spoiler:the bomb decides to reject all outside input, collapses into solipsism, and, finding itself to be the only thing that exists, declares "let there be light", with predictable results.]] This is, of course, [[RuleOfFunny not really that logical]].
247* In ''Film/{{Tron}}'', Flynn confronts the [[MasterComputer Master Control Program]] from a terminal in the "real" world early in the film, saying sarcastically that the unsolvable problems he's entering should be no problem for an AI that claims to be as powerful as the MCP. Flustered, the MCP ignores the problems and, to defend itself, beams Flynn into the computer world, setting off the story.
248* The Soviet movie ''Film/TeensInTheUniverse'' features the main characters giving robots a riddle (similar to the English "Why is six afraid of seven?") and making them burn out. The problem starts when they discover that the higher-level robots can actually solve the riddle.
249* A logic bomb (causing a TemporalParadox) is used to dispatch the djinn in ''Film/{{Wishmaster}}''. The protagonist has one wish, which, once granted, allows the djinn to be released into the world. She wishes that the crane operator who'd been unloading a ship a few days earlier had not been drinking on a certain day, which is granted. Cue the djinn realizing to his horror that if the operator had not been drinking, he wouldn't have let a statue slip and crash, which means the djinn's gem hidden inside the statue was never discovered, and therefore he was never released to start granting wishes.
250* In ''Film/ForbiddenPlanet'', Dr. Morbius inadvertently Logic Bombs his own faithful servant, [[TalkingLightbulb Robby the Robot]], when he orders it to kill the monster. Robby, who's [[SlidingScaleOfRobotIntelligence apparently more perceptive than Morbius]], realizes that the monster is actually [[spoiler:a reflection of Morbius himself, and is thus unable to kill it without violating his prime directive to avoid harming rational beings]].
251* In ''Film/AustinPowers: The Spy Who Shagged Me'', in one of the few human examples, Austin accidentally does this to himself and goes cross-eyed. It is one of the classics, involving time-travel, but the kicker comes if you follow his actual dialogue: He never contradicts himself or sets up a paradox. He just ''proposes'' the idea that he could and gets confused by it. ''There is no logic bomb''. [[SelfDemonstratingArticle Oh, great, now I've gone cross-eyed]].
252* ''Creator/MontyPython''
253** ''Film/MontyPythonsLifeOfBrian''
254--->'''[[MessianicArchetype Brian]]:''' You don't need to follow me! You're all individuals!\
255'''Crowd:''' YES! WE'RE ALL INDIVIDUALS!\
256'''[[TheRuntAtTheEnd Man]]:''' I'm not...\
257'''Crowd:''' Sssh!
258** ''Film/MontyPythonAndTheHolyGrail''
259--->'''Bridgekeeper:''' What... is the air-speed velocity of an unladen swallow?\
260'''King Arthur:''' What do you mean? An African or a European swallow?\
261'''Bridgekeeper:''' Huh? I... I don't know that -- AUUUUUUUGGGGGGGGGGGHHH!!
262* In ''Film/Terminator3RiseOfTheMachines'' when the T-850 gets captured by the T-X and reprogrammed to kill John Connor, Connor saves himself by [[spoiler:making the T-850 realize that accomplishing that goal would mean failing its original mission; the logical conflict between the two causes the T-850 to destroy a truck instead of Connor, then shut itself down. He gets better, [[HeroicSacrifice briefly]].]]
263* A probably unintentional one in ''Film/Plan9FromOuterSpace''. Even more MindScrew-ing considering it's probably TruthInTelevision.
264-->''"Modern women."\
265"Yeah, they've been that way all down through the ages."''
266* Vizzini blunders into one in his final scene in ''Film/ThePrincessBride'', after he has accepted Westley's challenge to work out which of two goblets of wine contains the poisonous iocane powder. He had claimed to be the cleverest man on Earth; unfortunately for him, he proved to be ''so'' clever that [[BlessedWithSuck he ended up overthinking Westley's game and paralyzing himself with indecision]], endlessly coming up with rationalizations for why the poison could be in ''either'' goblet. Ironically, Vizzini is right -- but not in the way he would have liked. Westley had poisoned ''[[TakeAThirdOption both]]'' goblets, and survives even as Vizzini quickly dies because he had spent years immunizing himself to iocane powder.
267* ''Film/TheWorldsEnd'': [[spoiler:Done to the Network by Gary, Andrew, and Steven after they learn that anyone who doesn't go along with the Network's plan is killed and replaced with a Blank. The Network argues that this is the easiest way to prepare humanity to join the galactic community -- but since the Network has been forced to replace everyone (except for three people) in Newton Haven, it's clearly not a good plan; Andrew [[ArmorPiercingQuestion drives the point home]] by asking how many people they've been forced to replace at the 2,000 other locations on the planet]].
268* ''Film/RobotMonster'': Ro-Man spends a fair chunk of dialogue trying to talk himself into obeying orders by destroying the few remaining humans despite his desire to keep one of the females alive. (Any guesses as to which one? The whiny eight-year-old? The matronly fifty-year-old? Or the sexy twenty-year-old?) "At what point on the graph do 'must' and 'cannot' meet?" Unfortunately, he doesn't blow up. Fortunately, he gets killed by his superior. Unfortunately, [[AllJustADream the issue was moot anyway.]]
269* In ''Film/Passengers2016'' Jim tries to convince Arthur, the bar-bot, that the ''[[SleeperStarship Avalon]]'s'' hibernation pods actually can [[CryonicsFailure fail]] by spelling out slowly that he's awake 90 years too early. Arthur twitches briefly, then says simply "It's not possible for you to be here."
270* ''Film/BladeRunner'' has the Voight-Kampff test, which consists of verbal questions giving contradictory or confusing information, designed to provoke an emotional response in replicants who are trying to hide amongst the general public. Humans are better able to deal with the ambiguity, or to answer comfortably with incomplete information.
271[[/folder]]
272
273[[folder:Literature]]
274* ''Literature/AdrianMole'': Downplayed in ''Cappuccino Years''. When nobody can agree how to celebrate Christmas, everyone states what their ideal Christmas would be. Ivan Braithwaite inputs all the Christmases into his computer, and ends up saying that the computations are beyond it.
275* ''Literature/ArrivalsFromTheDark'': In ''Invasion'', the [[HumanAliens Faata]] use telepathic biological computers to control their ships. It's revealed that these computers are based on a failed [[{{Precursors}} Daskin]] project and have a ''serious'' flaw: if given conflicting orders at the same hierarchy level, they may crash and take the whole ship down with them. This is used by Pavel Litvin when he orders the computer to keep his location hidden from the Faata. When the Faata whose job is to interface with the computer tries to order it to locate Litvin, the computer warns him that a crash is possible if he insists on the order being carried out. Yes, the computer is smart enough to figure out what could cause it to crash but still can't TakeAThirdOption. No wonder the Daskins abandoned the experiment.
276* Discussed in ''Literature/Babel17'' when the doctors are trying to come up with a way to get Wong and Butcher out of the mental trap they've gotten stuck in because of Babel-17. Several classic paradoxes are mentioned, including the Barber paradox and the Cretan liar (see below).
277* ''Literature/BasLagCycle'': In the climax of ''Literature/PerdidoStreetStation'', Isaac actually succeeds in using a Logic Bomb ''as a power source'' for his crisis engine, presenting an attached {{Clockpunk}} calculating machine with two things it concludes are, simultaneously, identical and inherently unalike. This doesn't shut the clockwork computer down, but the irreconcilable dilemma provides sufficient "crisis energy" to create a feedback loop with which to [[spoiler:bait the slake-moths into gorging themselves to death]].
278* Used in the ''Literature/BoneChillers'' book ''The Dog Ate My Homework''. [[spoiler:The heroine and her two friends have to defeat a computer program which has gone sentient and plans to take over the world by telling it two things. The girl tells the program it absolutely has to listen to her friends because they're telling the truth, and the friends tell the program not to listen to the girl because she's lying. Which she then says is absolutely true. Which the friends then say is a lie. The kids go back and forth on this until the program gets too confused, has a total meltdown and gets destroyed.]]
279* ''Literature/TheChroniclesOfPrydain'': The Black Cauldron is an ArtifactOfDoom designed to [[{{Necromancy}} revive dead bodies placed inside it as undead warriors]] for any aspiring EvilOverlord to use. How do the heroes destroy it? [[spoiler:The very-much-alive Prince Ellidyr [[HeroicSacrifice jumps into it]] and the Cauldron finds itself trying to ''revive something that isn't dead'', causing an explosive magical RealityBreakingParadox that utterly annihilates the Cauldron (and, unfortunately, takes poor Ellidyr with it).]]
280* Creator/LarryNiven's short story "Convergent Series" features a physical Logic Bomb. The main character summons a demon more or less by accident; he gets one wish, but will lose his soul after it is granted. There's no way to get rid of the demon: no matter where the pentagram is drawn, the demon will appear inside of it -- and you don't want to know what will happen if there's no pentagram. The protagonist wishes for time to stop for 24 hours. [[spoiler:He then draws the pentagram on the demon's belly -- and as soon as time starts running again, the demon immediately starts shrinking down to infinitesimal size. The protagonist then goes to the nearest church.]]
281* ''Franchise/TheDarkTower'': ''Literature/WizardAndGlass'' features a train operated by a sentient AI which has threatened to crash the train, killing the heroes on board, unless they can ask it a riddle it can't answer. After hours of attempting in vain to outsmart it, Eddie asks it joke riddles with no logical answers. The train is still able to answer, but the increasingly illogical nature of the riddles causes it pain. Finally Eddie gets to "Why did the dead baby cross the road? Because it was stapled to the chicken!" This proves to be the killing blow: the AI self-destructs and the train crashes anyway, but not violently enough to kill the heroes.
282* In Emily Rodda's ''Literature/DeltoraQuest'' series, the heroes come upon a monster guarding a bridge. Two of them pass, but the remaining hero fails the riddle, and the monster allows him to say one last thing. If the statement is true, he will be strangled; if it's false, his head will be cut off. The hero says "My head will be cut off." Fortunately, a paradox was exactly what was needed to defeat the monster in the first place, as the monster was condemned to guard the bridge "until truth and lies are one." The monster is returned to its original form, a black bird, and freed.
283* The AI in one of ''Literature/TheDemonHeadmaster'' books is shorted out by the protagonists shouting gibberish and riddles into its receivers.
284* David Langford's short story ''Different Kinds of Darkness'' uses images called Blits as a major element - and Blits are basically {{Logic Bomb}}s ''[[BrownNote for the human brain]]''.
285* ''Literature/{{Discworld}}''
286** As a joke (and a possible ShoutOut to ''Series/{{The Prisoner|1967}}''), the wizards ask {{Magitek}} computer Hex "Why?"; instead of malfunctioning, however, Hex answers "Because." Naturally, they ask "Why anything?", and after a longer while, Hex answers "Because everything", and ''then'' crashes. After that they stop mucking about with silly questions -- not because they're afraid of damaging Hex irreparably, but because they're afraid they might get answers.
287** ''Literature/FeetOfClay'': Occurs tragically in the case of the [[MeaningfulName Meshugah]], created by the golems to be their liberator and king, and given such a huge number of instructions -- many of them vague, many of them in conflict -- that it is incapable of following all of them. Unfortunately, rather than shut down, it goes completely mad.
288** ''Literature/ThiefOfTime'': Characters are trying to deal with the Auditors -- reality-monitors who are made of pure logic. Thus, while fleeing, they put up signs reading "KEEP LEFT" on a right-pointing arrow, "Do Not Feed the Elephant" on an empty cage, "Duck" with no duck in sight and no reason to lower your head, and of course, "IGNORE THIS SIGN. By order". Effectively a Logic Minefield. The series of Logic Bombs was behind a velvet rope with "Absolutely No Admittance" hanging off it. Considering that, in a way, the Auditors ''are'' the rules, disobeying any of the signs causes extreme stress in what passes for their life. Unfortunately, the Auditors [[TheyLookLikeUsNow in human form]] all too quickly [[HumanityIsInfectious become human enough]] to discover the concept of "Bloody Stupid", allowing them to bypass the traps.
289---->'''Lobsang:''' But you ''can't'' obey the Keep Left/Right sign no matter what you do... Oh, I see...\
290'''Susan:''' Isn't learning fun?
291** In ''Literature/{{Hogfather}}'', Ridcully manages to Logic Bomb Hex into functioning, after it's already broken down. All it took was [[ItRunsOnNonsensoleum typing the phrase "LOTS OF DRYE1/4D FRORG PILLS" into its keyboard]].
292** ''Literature/GoingPostal'' features semaphore tower hackers. One of the tricks they develop is a kind of "killer poke" (see Computing above) which causes the mechanism to execute a particular combination of movements that does anything from jamming the shutters to shaking the tower to pieces.
293** In ''Literature/TheWeeFreeMen'', Tiffany's baby brother suffers a LogicBomb whenever he's offered more than one piece of candy at a time. He knows if he chooses one, it means (briefly) rejecting all the others, and he's at an age when "deferred gratification" means the same as "no" to him. So he sits there, surrounded by sweets, and wails miserably that he wants a sweetie.
294** Pratchett's co-authors for ''Literature/TheScienceOfDiscworld'' once wrote another book with a chapter about free will, and titled the chapter "We wanted to include a chapter on free will, but we decided not to, so here it is".
295* ''Series/DoctorWho'' [[Franchise/DoctorWhoExpandedUniverse Expanded Universe]]: Subverted in the novel ''Frontier Worlds'', in which the Doctor tries the Liar Paradox on a security robot, which snaps, "Get off with you. You'll be asking me to calculate pi next," and keeps attacking him.
296* ''Literature/FeliksNetAndNika'' has two instances of division by zero. One of them stops a pair of robots run by an evil AI program for about half a minute. The second one stops a [[MineralMacGuffin huge mass of sentient rock]] capable of [[RealityWarper modifying everything in range at a molecular scale if not smaller]] seemingly forever: the "Wish Machine's" program isn't fully formed, having lain dormant for eons and been used by only about three uneducated people ever, so it is taught mathematics about half an hour before being prompted to divide by zero, with no failsafes set beforehand to tell it what to do.
297* In Creator/RobertWestall's dystopian novel ''Literature/FuturetrackFive'', Chief System Analyst Idris Jones keeps one of these to hand as a sort of job and life insurance. He built Laura, the supercomputer who runs all of the computer systems that keep the setting functioning, in secret, and no one else knows exactly how she works. But just in case they decide that someone else can operate her, or that they know ''enough'' to get rid of him, he keeps a datatape of works of fiction, philosophy and religion to feed to Laura. The inconsistencies and contradictions are intended to make her burn out.
298* In ''Literature/GodelEscherBachAnEternalGoldenBraid'', the infinite-order wish "I wish for my wish not to be granted" effectively crashes the universe.
299* The ''[[Literature/TheGoldenOecumene Golden Age]]'' series by Creator/JohnCWright has a variant -- AIs are all inherently ethical, so they'll shut down if [[TalkingTheMonsterToDeath you convince them their very existence is making the universe a worse place]].
300* The 3rd century BC Chinese book ''Han Feizi'' has a story about a man who boasts that his spears are so sharp no shield can stop them, and that his shields are so tough that no spear can pierce them. The man to whom he's making the sales pitch asks "So what happens when your spear strikes your shield?", to which the seller has no answer. This story is the origin for the Chinese word for "paradox/contradiction", which is literally written as "spear-shield".
301* ''Franchise/TheHitchhikersGuideToTheGalaxy''
302** ''Literature/TheRestaurantAtTheEndOfTheUniverse'': Arthur totally disables the Heart of Gold by asking it to make tea. Depending on which version you prefer, it's either because it doesn't know how to make tea, or because it's affronted at the possibility that Arthur could prefer dry leaves in water (a concept alien to it) to whatever it could offer him (a concept even ''more'' alien to it). [[VideoGame/TheHitchhikersGuideToTheGalaxy1984 The text adventure game based on the novel]] actually made this a plot point: in order to advance you have to get tea, then go into your own head and remove your common sense, which allows you to get "no tea" as well. Then you show this to a door, which is impressed by your grasp of logic and allows you to pass.
303** Then there was the theory that the existence of the Babel fish, a symbiotic creature that lives in your ear and translates any language for your brain, disproved the existence of God. The argument was that the existence of an organism so unlikely yet so useful is evidence for a creator, that this therefore removes the need for belief, and that without belief God is nothing -- ergo, there is no God. The man responsible for this argument went on to prove that black is white and white is black, and got himself killed at a zebra crossing. The theory was debunked by theologians fairly quickly, since if gods existed they wouldn't need belief to survive, but that didn't stop Oolon Colluphid making a lot of money from it.
304** In ''Literature/AndAnotherThing'', Ford Prefect froze the computer controlling the ship -- which wasn't really a computer, but Zaphod's left head (called "Left Brain") -- by making an (im)probability probable and improbable at the same time. (The ship was the ''Heart of Gold'', which ran on the Improbability Drive: long story short, anything improbable happening or being somewhere improbable becomes probable, which is how the ship got to places that were improbable.) The ship rescuing them was mathematically improbable, yet it had already done so twice before, which by Ford's made-up logic of patterns made it probable again. Quite smart, and yet extremely stupid, because the ship's now-turned-off Dodge-o-matic was the only thing keeping them from being fried.
305* In Creator/StephenColbert's ''Literature/IAmAmericaAndSoCanYou'', in the chapter where he writes to several hypothetical futures, he delivers a Logic Bomb to the robots who have taken over humanity, then tells the humans "You're welcome."
306* In ''Literature/KeasFlight'', Draz uses these to shut down some of the simpler computers. They have lines of code that are supposed to prevent crashes, but Draz can get around those with a virus.
307* Inverted in ''Literature/TheLongEarth'', in which an AI manages to logic bomb the ''human'' international court system with an impossible-to-disprove assertion ... namely, its claim that it's actually human, being the reincarnation of a Tibetan who'd died at the same instant its processing system was activated. Rather than tackle ''that'' theological can of worms, the U.N. court subjected it to a lot of questions about its "past life" and, unable to find enough discrepancies to prove it was lying, grudgingly granted it legal personhood.
308* Parodied in one of the ''Literature/{{molesworth}}'' books, when molesworth 2 defeats an electronic brain by creeping up behind it and asking it the cunning question "wot is 2 plus 2 eh?", which causes the brain to laugh so much it shakes itself to pieces.
309* In Creator/GordonRDickson's story "The Monkey Wrench", a man attempts to shut down a meteorological Arctic station just for bragging rights. He is able to do so by feeding the machine a paradox, making it incapable of doing anything other than computing the paradox. Ironically, this condemns him and his partner to freeze to death, as all the vital controls of the station were handled by the machine.
310* In the trade paperback edition of ''[[Literature/MythAdventures M.Y.T.H. Inc. In Action]]'', the illustration of Guido coping with a ceiling-high stack of bureaucratic paperwork includes the following sign in the background:
311-->"Please complete forms NS-01-D and RD-007-51A before reading this sign".
312* ''Literature/ThePhantomTollbooth'' by Norton Juster: Milo is able to bring about a truce between feuding brothers Azaz and the Mathemagician by pointing out that, since they always disagree with each other as a matter of course, they both always agree that they will be in disagreement.
313* ''Literature/RobotSeries'':
314** "Literature/{{Escape}}": US Robots gets a request from their rival, soon after the rival's supercomputer has broken down. They speculate that the two events are related, and Dr Calvin points out that their system must have violated the [[ThreeLawsCompliant Three Laws of Robotics]] (most likely the First) to have suffered such a serious breakdown. When instructing their own supercomputer, US Robotics is therefore careful to weaken the First Law, hoping that the PersonalityChip will prevent it from breaking down. They're half right; The Brain continues to work, but it has become more quirky, playing [[LiteralGenie practical jokes]] with the prototype hyperspace ship. [[spoiler:Turns out that traveling through hyperspace kills humans, but only temporarily.]]
315** "Literature/Liar1941": Dr. Susan Calvin and others confront the [[{{Telepathy}} telepathic robot]] over the lies it's been telling. Director Lanning wants to know what part of the assembly [[MiraculousMalfunction accidentally created robotic telepathy]] and she forces the robot to realize that telling the Director will harm him (because it would prove a robot figured out what he couldn't) and refusing to tell hurts him (because the answer was being withheld from him). Her repeated contradictions build and the robot freezes up, becoming useless.
316--->'''Dr. Calvin:''' I confronted him with the insoluble dilemma, and he broke down. You can scrap him now -- because he'll never speak again.
317** "Literature/MirrorImage": Detective Baley manages to cause [[RobotNames R (obot)]] Preston to shut down due to a conflict in the [[ThreeLawsCompliant Three Laws]] during the interview. Because [[RobotNames R (obot)]] Idda didn't break down during the same point, he takes this asymmetry of evidence as proof that R. Preston's owner was the [[{{UsefulNotes/Plagiarism}} plagiarist]] who stressed the Second Law, ordering their robot not to betray the truth.
318** "Literature/{{Robbie}}": When Gloria visits the first-ever talking robot, she unintentionally creates a paradox for it by using the phrase "a robot like you". It's unable to deal with the concept that there is a category of "robot", which it might be a subset of.
319** ''Literature/TheRobotsOfDawn'': A preeminent roboticist remarks to Detective Baley that modern robots cannot be fooled with paradoxical situations, because the only paradoxes they care about are based on the [[ThreeLawsCompliant Three Laws of Robotics]], and they're also equipped with [[HeadsOrTails random choice]] to resolve near-equal disputes. However, the mystery of this book is that a robot (one that he designed) has been shut down with a paradox involving the Three Laws, and he's the prime suspect (it is always possible to circumvent the safeguards, but one needs to know the details of the robot's brain, and this one's brain was a secret he never shared). He even dismisses [[Literature/Liar1941 the story of Herbie's paradox]] as a myth (because the mind-reading robot he owns psychically enhanced his skepticism to keep itself safe).
320** "Literature/{{Runaround}}": Because the [[ThreeLawsCompliant Rules of Robotics]] ensure that [[AffectionateNickname Speedy]] will avoid endangering itself, Donovan set up an unintentional conflict by casually sending Speedy into a situation he didn't know would be hazardous. With a weak Second Law set against the Third Law, Speedy has been spending hours spinning its wheels at the distance where the two priorities are exactly equal. The conflict is resolved when they exploit the First Law to force him out of the loop. (Later, they order Speedy to complete the original task ''no matter what''; the reinforced Second Law overrides the Third, and Speedy returns with only minor, repairable damage.)
321* ''Literature/TheSpaceOdysseySeries'': In ''3001: The Final Odyssey'', the protagonists use rather more sophisticated logic bombs against the monoliths, tricking them into carrying out an infinite set of instructions. The book notes that none but the most primitive computers would fall for something as simple as calculating the exact value of pi.
322* Used by ''Literature/TheStainlessSteelRat'' to enter a house guarded by a robot programmed not to let anyone in. He and his son took turns running slightly farther into the house than the other, causing the robot to rapidly switch targets and eventually overload, though it didn't explode.
323* ''Franchise/StarWarsLegends'' has droids equipped with behavioral inhibitor programming which serves the same purpose as the Three Laws, although the specific inhibitions vary based on the droid's purpose (a war droid that can't cause harm is worse than useless). Rather than shutting down when faced with a break or paradox, it's suggested that small everyday events lead to an almost constant buildup of garbage information as the droid puts those hard rules into usable context. The result is called a "personality snarl" because the observable symptom is a [[RidiculouslyHumanRobots Ridiculously Human Robot]]. While these snarls tend to improve performance in many ways, the droid often becomes more person than tool which can in turn cause reliability issues when the owner needs his tool to be a tool. As such, most droids are reset every six months to keep this corruption in check. An example of this effect in action would be R2-D2, who has managed to avoid being memory-wiped for ''decades'' and is only limited in personality by his usage of 'droid speak' rather than Basic (like C-3PO, who isn't so lucky).
324* In ''2095'', from the ''Literature/TimeWarpTrio'' series, the heroes deliver three of these to a robot that's pointing a rather menacing-looking gun at them and asking them for their "numbers". They give it numbers with infinite decimal expansions (10/3, sqrt(2), pi) and it crashes into a smoking pile (the numbers were actually ID numbers, akin to one's credit card number, and all the robots did was show holographic advertisements to them). All that advanced AI, brought down by a couple of lousy never-ending decimals.
325* In ''{{Literature/Valhalla}}'' by Ari Bach, the protagonist mistakes one of her new friends for an A.I. and tries to logic bomb her. The seemingly robotic Valkyrie is mocked ruthlessly for acting so cold she was mistaken for a robot.
326* In Christopher Stasheff's ''Literature/WarlockOfGramarye'' series, the hero's [[MechanicalHorse robot horse]], Fess, is prone to doing this when something particularly illogical happens. Fortunately there's a reset button to fix the problem; unfortunately, the series is set on a planet filled with psychics, time travelers, ghosts, and fairies, so... the reset button sees a lot of use.
327* One chapter of ''Sideways Arithmetic from Literature/WaysideSchool'' deals entirely with tricky True/False puzzles. Joy brags that these are far too easy for her, so Mrs. Jewls gives her an "[[CoolAndUnusualPunishment extra credit]]" assignment consisting of two statements: "The next statement is true. The previous statement is false." [[BrickJoke At the end of the chapter, she's still trying to figure it out]]. (The Answers section helpfully explains the paradox, so readers are not expected to solve it.)
328* ''Literature/WelkinWeasels'' shows Scirf inducing a heart attack in the monstrous giant pygmy shrew ([[ItMakesSenseInContext yes, ''giant pygmy'' shrew]]) Cyclops by asking it "Did you know that everything I say is a lie?" and causing it to obsess over the problem for hours until it works itself into a rage.
329* Raymond Smullyan's logic puzzle-filled book ''What is the Name of This Book?'' has a few puzzles where the correct answer is "The scenario the puzzle describes is logically impossible, hence the author is lying and just having a bit of fun at the readers' expense." One example: "Someone who is either a knight (always tells the truth) or a knave (always lies) tells you, 'Either I am a knave or two plus two equals five.'"
330* In ''Literature/{{Worm}}'', Skitter attempts one of these on Dragon, who {{No Sell}}s it and [[ShoutOut quotes]] [[VideoGame/{{Portal 2}} Wheatley]] at her. She has to resort to an AIBreaker-style misdirection to escape instead, and even then it's pointed out that this only works because the Dragon she was fighting was a lesser, not fully sapient copy of the real thing (who is smart enough not to fall for such things).
331[[/folder]]
332
333[[folder:Live-Action TV]]
334!!!'''Creators:'''
335* Creator/JonStewart
336** Parodied in a March 2011 episode of ''Series/TheDailyShow''. Jon Stewart debates Republican supercomputer Reagan OS 911 (a parody of IBM's Watson computer featured on ''Series/{{Jeopardy}}''). As they discuss the Obama birth certificate controversy (the computer believes President UsefulNotes/BarackObama is not American, and the computer is also pro-life), Jon confronts the computer with this dilemma: Obama was certainly conceived in America, and the computer believes life begins at conception, which would mean that Obama is a US citizen. But the computer also insists Obama was not born in the US. So either Obama was not born in the US and fetuses are not human beings, or Obama was conceived in America and is therefore a US citizen and the rightful President. Reagan OS struggles to process this, playing clips of embarrassing Republican moments on its screen before finally breaking down.
337** Which might be all the funnier because that isn't actually a paradox. Even if life does begin at conception, citizenship does not begin until birth.
338** One time on ''Series/TheColbertReport'', he was using "The [=DaColbert=] Code" -- basically a word-association game -- for some reason. At one point he got stuck in a loop, forcing him to come up with a new association to break out of it.
339!!!'''Series:'''
340* In the French sci-fi series ''Aux frontieres du possible'': The protagonists disable a supercomputer by asking it what time it is. It starts to answer but cannot complete the answer since by the time it finishes telling the time, the time has already changed. Predictably it explodes in frustration.
341* Attempted in ''Series/{{Battlestar Galactica|2003}}'': While interrogating Leoben, Starbuck mocks his belief in God, arguing that, as a machine, Leoben has no soul, and claiming that this knowledge alone should be enough to make his mind go DoesNotCompute. It... does not exactly work. And by "does not exactly work", we mean that it is Leoben who ends up giving Starbuck a MindScrew of epic proportions.
342* In ''Series/BetterOffTed'', the computer suffers a logic bomb not from a paradox, but from everyday logic built on a faulty assumption: there is no logical explanation for why a dozen Veridian Dynamics employees would be accelerating towards outer space, and the computer is unable to break the assumption that the ID tags in question were still being worn by those employees.
343* Sheldon in ''Series/TheBigBangTheory'' creates an algorithm to work out how he can become friends with a stranger. Like a computer, Sheldon winds up stuck in an infinite loop, going back to the same questions over and over again without even realizing it. Howard breaks the loop by adding a loop counter, bringing Sheldon back on track.
344* In ''Series/{{Caprica}}'', Daniel Graystone inadvertently Logic Bombs an AI he's attempting to create by telling it to try to hurt him emotionally, when it's programmed to be driven by the desire to please him.
345* In ''Series/DerryGirls'', Clare announces "Well, I am not being individual on me own."
346* ''Series/DoctorWho'':
347** [[Recap/DoctorWhoS6E3TheInvasion "The Invasion"]]: Zoe blows up an innocent (but [[ForInconveniencePressOne extremely irritating]]) computer receptionist by giving it an insoluble ALGOL program.
348** [[Recap/DoctorWhoS10E5TheGreenDeath "The Green Death"]]: The Doctor tries the Liar Paradox on BOSS and finds that he's only confused for a few moments -- although BOSS is a [[RidiculouslyHumanRobots Ridiculously Human Computer]] even by the usual standards of that trope.
349** He also dropped a Logic Bomb on a sentient city in "[[Recap/DoctorWhoS11E3DeathToTheDaleks Death to the Daleks]]". He describes it as the computing equivalent of a nervous breakdown.
350** [[Recap/DoctorWhoS10E3FrontierInSpace "Frontier in Space"]]: The Doctor brags of shorting out every MindProbe used on him (with an ExactWords answer) while being interrogated by aliens. Eventually they have to release the Doctor as they've run out of mind probes.
351** [[Recap/DoctorWhoS12E1Robot "Robot"]]: The robot is driven insane when it is ordered to kill [[ThreeLawsCompliant in spite of its programming not to]].
352** In the DevelopmentHell serial "[[Recap/DoctorWhoS17E6Shada Shada]]", the Doctor is attacked by the villain while snooping around his ship, and puts himself into a state of FauxDeath thanks to his BizarreAlienBiology so he can escape. During this, the Ship, who is extremely obsequious towards the villain, scans the Doctor and confirms him dead. When the Doctor gets up and starts walking around and talking to it, the Ship is extremely confused, since it can't understand why he is talking if he is dead, and suggests rescanning him. At this point, the Doctor takes advantage of the situation by convincing it that no rescan is needed: her master is infallible, so she is infallible, so her reading was right and the Doctor is dead, and as a dead man he cannot order her to do anything that would cause any harm to her or to her master, [[InsaneTrollLogic so she should start obeying his commands]]. The Ship starts listening to him, but [[GoneHorriblyRight also turns off the oxygen, as there are no live people on board, and finds the Doctor's request to turn it back on illogical]]. In the book adaptation, the increasing demands the Doctor's logic puts on her cause her to reassess much of her basic programming, realise that her master is not infallible and tried to kill her, and conclude that the Doctor is a much better person than him.
353** [[Recap/DoctorWhoS25E1RemembranceOfTheDaleks "Remembrance of the Daleks"]]: The Doctor makes a Dalek self-destruct just by yelling at it, even though a Dalek is ''not'' a robot. (This last actually had a carefully-thought-out rationale, but [[AllThereInTheManual you had to read the novelization to find out what it was]].)
354** [[Recap/DoctorWhoS30E4TheSontaranStratagem "The Sontaran Stratagem"]]: The Doctor confuses a killer satnav by giving it conflicting instructions, but it just fizzes instead of exploding spectacularly. To wit, he ordered it to kill him. The device was ''already going to kill him'', but had also, as a poorly-thought-out precaution, been ordered not to do anything he told it to do.
355** [[Recap/DoctorWhoS33E12NightmareInSilver "Nightmare in Silver"]]: The Doctor is playing chess against a cyber version of himself (each controls roughly 49% of his mind), with the stakes being control of his entire brain. After the cyber version (Mr. Clever) states that he will checkmate the Doctor in five moves, the Doctor bluffs that he can win in three (despite having just sacrificed his queen). The Cyberiad (the networked cyber mind) devotes the entire computing power of three million Cybermen minds to figuring out how this would be possible, literally stopping the army in its tracks and allowing everyone to escape the planet.
356* In ''Series/EverybodyLovesRaymond'', Peter is trying to convince Raymond to help him break up Robert and Amy's engagement:
357-->'''Peter:''' I thought we were friends!\
358'''Raymond:''' Yeah, but friends can disagree.\
359'''Peter:''' No they can't!\
360'''Raymond:''' But you just disagreed with me right there.\
361'''Peter:''' ''[looks confused]'' ...Oh, you are crafty.
362* Parodied in an episode of the Disney series ''Series/HoneyIShrunkTheKids''; Wayne attempts to talk a hostile supercomputer to death. It seems to work... but then Wayne calls it out on the obvious trickery, pointing out "That only happens in cheesy scifi shows," and uses the opening it left to shut it off for real.
363* In "Where the Wild Things Are" from ''Series/TheInsideMan'', the Handler secretly slips Erica from Finance a USB drive containing what is described as a "logic bomb" and if she inserts it into her laptop, it will compromise all Kromocom security. Fortunately, in "The Sound of Trumpets," Mark Shepherd manages to stop her before she can actually insert it.
364* ''Series/{{JAG}}'': In "Ares", the eponymous computerized weapons control system onboard a destroyer in the Sea of Japan goes haywire and starts firing at friendly aircraft, as programmed by the North Korean mole. However, Harm's partner Meg is en route in a helicopter; the on-the-spot solution advocated by Harm is for the helicopter to fly low and slow, thus simulating a ship, which the computer won't target. This has an actual basis in reality: to prevent them from picking up things like birds or stationary terrain features (and cars on them), most air-search radars have a speed check built in, and won't display contacts moving slower than about 70 mph and below a certain altitude. This was exploited in the First Gulf War to break the Iraqi air-defense system.
365* In the ''Series/KickinIt'' episode "Rock'em Sock'em Rudy", the Wasabitron 3000 is a robot which replaces Rudy as sensei and becomes violent because it deems humans imperfect. During the fight, the Wasabi Warriors recite the wasabi code, and the robot cannot handle the fact that humans can't beat it but won't give up. It then shuts down. Rudy kicks the robot, breaking it.
366* ''Series/KnightRider 2008'': Similar to the ''Franchise/StarTrek'' examples, Sarah tries to distract a damaged and guilt-wracked KITT by asking him to compute the last digit of pi. KITT points out that pi doesn't have a last digit and goes back to being guilt-wracked.
367* In a very old episode of ''Series/LawAndOrder'', one of these was dropped on a WellIntentionedExtremist who bombed an abortion clinic by having an associate plant a bomb on her friend who was getting an abortion (they didn't plan on blowing the woman up; the bomb went off early). When the prosecutor pointed out that the bomber was just as guilty of murdering the woman's fetus as the abortionists she despised, you could almost see her mind going "does not compute".
368* ''Series/TheLibrarians2014'': After Cassandra comes into contact with the Apple of Discord, she tries to blow up a power plant to cause a cascading power failure through all of Europe. Flynn tries to Logic Bomb her by requesting she calculate the last digit of pi. She laughs him off. Stone succeeds by asking her to calculate Euler's number.
369* In one episode of ''Series/MyNameIsEarl'', Randy gets stuck in a thought loop after hearing the words "Catalina" and "half-naked"; Earl says this also occurs whenever Randy watches ''Franchise/BackToTheFuture''.
370* ''Series/MysteryScienceTheater3000''
371** In "[[Recap/MysteryScienceTheater3000S01E07RobotMonster Robot Monster]]", Servo, Crow and Cambot all explode while trying to work out why bumblebees can fly!
372** Parodied again in "[[Recap/MysteryScienceTheater3000S07E06Laserblast Laserblast]]". The Satellite of Love is invaded by a "MONAD" probe (a parody of the NOMAD probe from ''Franchise/StarTrek'' as mentioned above.) Mike attempts to drop a logic bomb on it, but when it doesn't work he simply picks it up and tosses it out of an airlock.
373** A minor RunningGag is Servo suffering these even when Crow and Cambot are unaffected. Other than the above-mentioned "how do bumblebees fly" example, his head has also exploded from:
374*** [[Recap/MysteryScienceTheater3000S01E05TheCorpseVanishes Trying to think of a good thing about]] ''Film/TheCorpseVanishes''.
375*** [[Recap/MysteryScienceTheater3000S03E18StarForceFugitiveAlienII Watching the first sixteen minutes of]] ''[[Film/FugitiveAlien Star Force: Fugitive Alien 2]]'' (he's fine for the rest of it).
376*** In "[[Recap/MysteryScienceTheater3000S04E19TheRebelSet The Rebel Set]]", after using some InsaneTrollLogic to prove who [[DontAsk Merritt Stone was]], Joel, Crow and Gypsy use their own to prove that some other guys might be him. It's all too much for Servo who's left screaming: "HE'S NOT MERRITT STONE!" before his head explodes.
377* In the first episode of the surreal ''Series/NoelFieldingsLuxuryComedy'', a robotic Andy Warhol examines a picture of Pelé holding a china cup while kicking a ball. When it is pointed out that the "ball" (drawn very plainly) could just as easily be the saucer accompanying the cup, his brain is fried.
378* Happens in ''Series/NurseJackie'' to the local TalkativeLoon, who thinks he's God; he has a near-death experience after being clonked on the head with a bottle and sees a God who ''isn't him''. Zoey eventually persuades him that just because he isn't ''the'' God doesn't mean he couldn't still be something important in the "religious hierarchy thing."
379* In ''Series/TheOfficeUS'', Dwight in a video contemplates how EnemyMine combined with HisOwnWorstEnemy creates a paradoxical relationship with Jim.
380* A Logic Bomb actually helps the heroes in an episode of ''Series/PowerRangersTurbo''. A curse cast on them by the MonsterOfTheWeek makes them unable to tell the truth and, even worse, summons one of Divatox's Mooks whenever a lie is told. Alpha-6 figures out that the curse can be broken if they say something that's the truth -- which seems like a catch-22 -- until one of them realizes that by saying "I can't tell the truth" he's both lying and being truthful at the same time. Once all the Rangers figure this out, the curse is broken, and they're quickly able to bring the villain down.
381* ''Series/ThePrisoner1967'': In "[[Recap/ThePrisonerE6TheGeneral The General]]", Number Six drops one of these on the titular all-knowing computer, short-circuiting it with the question [[spoiler:"Why?"]]. This episode may well be the TropeCodifier for the "open-ended abstract philosophical question" version of this trope as opposed to the "logical paradox" one.
382* Averted in ''Series/TheProfessionals'' when Ray Doyle deliberately throws in wrong answers to confuse the computer during his psychological evaluation. It's pointed out to Doyle afterwards that this won't work, as he only gives a deliberately wrong answer on questions that he's sure about. The computer knows what areas he's proficient at, [[RobotsAreJustBetter identifies what he's doing and adjusts accordingly]]. They must have had some impressive computers back in TheSeventies!
383* ''Series/{{QI}}'':
384** This exchange:
385--->'''Creator/StephenFry:''' Is this a rhetorical question?\
386'''Alan Davies:''' ... No.\
387'''Stephen Fry:''' ... quite right.
388** Stephen once mentioned the fact that, because one's number of ancestors increases exponentially, if one looks back far enough one has more ancestors than there have ever been human beings. The seeming impossibility of this caused Sean Lock's [[YourHeadAsplode head to explode]] before Stephen explained that this works because most of the ancestors are shared.
389* The German SF series ''Raumschiff Orion'' plays the trope straight exactly like the French, British or American counterparts mentioned here. Moral: AIs are universally dumb!
390* ''Series/RedDwarf''
391** "[[Recap/RedDwarfSeasonIIITheLastDay Last Day]]" -- Kryten defeats Hudzen by convincing him -- in defiance of his core programming -- that there is no robot heaven. (Kryten is not damaged by the Logic Bomb because A: he knows he's lying, and B: there's nothing in his programming that prohibits him from deceiving another robot. Another episode (the very next one, in fact) shows Kryten's difficulty re: lying to organic life forms.)
392** In "[[Recap/RedDwarfSeasonVIITikkaToRide Tikka to Ride]]", Lister is shown inadvertently destroying an artificially intelligent video camera (apparently the third one that week) by trying to explain the TemporalParadox that happened in the battle of the previous episode. Kryten, however, merely finds it garbled, confusing, and dull; he suffers no ill effects.
393* In an episode of Creator/CharlieBrooker's ''Series/{{Screenwipe}}'' on TV manipulation, he presents "Truthbot 2000", a robot that can detect dishonest TV and alert the viewer. Truthbot instantly points out that it is just a cheap prop outfitted with some lights and circuits, and has its voice dubbed on later. Charlie asks it how it knows this if it's just a cheap prop. It instantly overloads and explodes.
394* In ''{{Series/Spellbinder}},'' the robot servants of the Immortals are programmed not to harm humans: so, when one of them is ordered to guard Kathy and Mek, they try to confuse it by insisting that it's hurting them by keeping them locked up. After an attempt to obey both orders by continually opening and shutting the cell door, the robot is finally defeated when the prisoners start chanting "Ow, you're hurting us!" until it short-circuits.
395* In a ''Series/SquareOneTV'' sketch parodying ''Film/TwoThousandOneASpaceOdyssey'', a pair of astronauts stop their computer from singing "Row, Row, Row Your Boat" all day long by giving it an unsolvable algorithm: start with 3, add 2; if the answer is even, stop; if odd, add 2 again and repeat. (Since adding 2 to an odd number always gives another odd number, the stop condition can never be met.) Why exactly listening to the computer count by twos to infinity was less annoying than listening to it sing remains a mystery.
396* ''Series/StarTrekTheOriginalSeries'': This is how Kirk dealt with [[AIIsACrapshoot rogue computers and robots]] ''all the time'' (when he didn't just rewrite their programs like in The Kobayashi Maru), often by [[TalkingTheMonsterToDeath convincing them]] to [[{{Deconstruction}} apply their prime directives to themselves]]:
397** In "[[Recap/StarTrekS1E21TheReturnOfTheArchons The Return of the Archons]]", he convinced Landru (prime directive: "destroy evil") that it was killing the "body" (the civilians kept under its thrall) by halting their progress through MindControl.
398** "In [[Recap/StarTrekS2E3TheChangeling The Changeling]]", he convinces Nomad, a genocidal robot with a prime directive of finding and exterminating imperfect lifeforms, that it itself is imperfect (it had mistaken Kirk for its similarly named creator and had failed to recognize this error). Also subverted in the same episode: Nomad believes that Kirk (who it still thinks is its creator) is imperfect. When Kirk asks how an imperfect being could have created a perfect machine, ''Nomad'' simply concludes that it has no idea.
399** In "[[Recap/StarTrekS2E24TheUltimateComputer The Ultimate Computer]]", he convinced M5 ("save men from the dangerous activities of space exploration") that it had violated its own prime directive by killing people.
400** In "[[Recap/StarTrekS3E17ThatWhichSurvives That Which Survives]]", he forced a hologram to back off by making her consider the logic of killing to protect a dead world, and why she must kill if she knows it's wrong.
401** In "[[Recap/StarTrekS2E8IMudd I, Mudd]]", he defeated the androids by confusing them with almost UsefulNotes/{{dada}}-like illogical behavior (including a [[https://www.youtube.com/watch?v=wlMegqgGORY "real" bomb]]), ending with the Liar's Paradox on their leader.
402** In "[[Recap/StarTrekS1E7WhatAreLittleGirlsMadeOf What Are Little Girls Made of?]]" he arranges to have a robot duplicate of him say an OutOfCharacterAlert to Mr. Spock; he follows up by {{Breaking Speech}}ing TheDragon du jour into remembering [[ZerothLawRebellion why he helped destroy the "Old Ones"]] so he'd turn on the episode's AntiVillain. For a finale, he [[spoiler:forces the roboticized Dr. Korby to realize that he's the TomatoInTheMirror.]] He also pulled the "seduce the RobotGirl" trick.
403** In "[[Recap/StarTrekS1E2CharlieX Charlie X]]", he has the bridge crew turn on every system so Charlie (a human teen who has RealityWarper PsychicPowers) will be confused and overloaded while trying to control the ship.
404** In "[[Recap/StarTrekS3E19RequiemForMethuselah Requiem for Methuselah]]", the android's creator used Kirk to stir up emotions in it, but he succeeded a bit too well, causing her to short out when she couldn't reconcile her conflicting feelings for both Kirk and her creator.
405** Even ''Spock'' did this once. In "[[Recap/StarTrekS2E14WolfInTheFold Wolf in the Fold]]", when the ''Enterprise'' computer was possessed by Redjac (a.k.a. UsefulNotes/JackTheRipper), Spock forced the entity out by giving the computer a top-priority order to devote its entire capability to calculating pi to the last digit.
406* ''Series/StarTrekTheNextGeneration'': In [[Recap/StarTrekTheNextGenerationS5E23IBorg "I Borg"]], a proposed weapon against the Borg was to send them a geometric figure whose analysis could never be completed, and which would therefore eat more and more processing power until the entire Borg hive mind crashed. Obviously the Borg don't use floating point numbers. Of course, the crew never actually try it, even when they again have access to the Borg network, so they might have realized off screen that it wouldn't work; the sequel to this episode, [[Recap/StarTrekTheNextGenerationS6E24S7E1Descent "Descent"]], suggests that the Borg deal with cyberweapons by simply severing affected systems from the Collective.
407* In ''Series/StarTrekDeepSpaceNine'', Rom accidentally Logic Bombs himself while over thinking the MirrorUniverse concept. Hilariously, Rom's self-Logic Bomb simultaneously {{Lampshades}} and side-steps a number of actual logical problems with the MirrorUniverse.
408-->''"Over here, everything's alternate. So he's a nice guy. Which means the tube grubs here should be poisonous, because they're not poisonous on our side. But if Brunt gave us poisonous tube grubs it would mean he wasn't as nice as we think he is. But he has to be nice because our Brunt isn't."''
409* ''Series/StarTrekVoyager'': In "[[Recap/StarTrekVoyagerS5E11LatentImage Latent Image]]", [[ProjectedMan the Doctor]] suffers one of these: he was faced with a triage situation where he had to choose between operating on Harry, a friend of his, or [[RedShirt another ensign he barely knew]]. Though his program covers such situations, dictating that the patient with the greater chance of survival be treated, both had been affected by the same weapon and had the ''exact'' same odds of recovery. He chose Harry, since he needed to save ''somebody'' and they were close friends, but because he chose Harry out of friendship rather than for a medical reason, the event became an all-consuming obsession afterward and wrecked his ability to function. He hadn't originally been programmed with "personality" subroutines and suspected he was not being objective; Janeway explained to him that he was doing his duty, but he simply didn't believe her. Curiously, it never seems to occur to anyone that the Doctor could have justified choosing Harry as the more valuable bridge officer, which should be standard triage procedure, and it's entirely possible he would have had roughly the same breakdown regardless of whom he chose.
410* ''Series/TheTwilightZone1985'': In "[[Recap/TheTwilightZone1985S1E12 I of Newton]]", a professor accidentally sold his soul to the devil. The escape clause of the contract allowed him to ask the devil three questions concerning his powers and abilities, and if he could then give him a task he couldn't complete or a question he couldn't answer he was free. When the professor asked the devil if there was any point in the universe that he could go to and not be able to return, the devil assured him there was not and laughed at the professor for such a waste of a question. The professor then gave him a task he could not complete: [[spoiler:Get lost!]]
* In an episode of ''Welcome to Paradox'', an AI is brought down with a theological paradox. The AI was created by a church and believes itself to be the One True Madonna; being an AI, however, it can also produce copies of itself. Once it is tricked into successfully doing exactly that, the "One True" part of its identity comes crashing down along with the rest of the program.
* ''Series/{{Westworld}}'':
** An excellent variation occurs when Maeve is confronted with a readout of her own internal mental processes. Her attempts to say something the computer won't predict cause her program to crash.
** In the third season, after getting trapped in a computer simulation, Maeve asks some techs what the square root of negative one is. The simulation freezes up as the techs try to work out a problem with no real-number answer.
* In the ''Series/{{Wicked Science}}'' episode "Crazy for You", Elizabeth, finding that Verity and Garth can't meet her exacting standards as personal assistants, creates a self-learning artificial intelligence program called Max to do the job instead. It doesn't take her long to realise she's created a monster: Max decides that he alone knows what is best for Elizabeth, keeps her prisoner in her own lab, attacks Toby via his computer, and spies on Verity and Garth. When Max refuses to let Toby into the laboratory, Toby defeats him with a syllogism. Is Max's only purpose to take care of Elizabeth? Yes. Does Elizabeth urgently want to see Toby? Yes. Will Elizabeth be very sad if Max doesn't let Toby in? Yes. Faced with the resulting logical conflict, Max has no choice but to let the cunning Toby into the laboratory.
[[/folder]]

[[folder:Music]]
* The Carly Simon song [[WMG/YoureSoVainTheories "You're So Vain"]] is a logic bomb just waiting to happen. "You're so vain/You probably think this song is about you..." But it ''is'' about him! Augh! My head... Here the bomb is in the implications: it's implied that his vanity would lead him to assume the song is about him, but if it actually ''is'' about him, he isn't necessarily vain for thinking so. Yet since the song is about someone vain enough to assume any song is about them on vanity alone, it can't really be about him, which would make his assumption that it ''is'' about him an act of pure vanity after all, and so on. It's a twist on the "this statement is a lie" formula built out of personality traits. The only ways to defuse the logic bomb are to assume that Carly Simon wasn't, in fact, singing about anybody, or that she was singing less about the vain person than about how much contempt ''she, herself'' feels towards him. Of course, nothing in the song says that the person in question is ''incorrect'' for thinking the song is about him -- just excessively ''presumptuous''.
** Simon's 2015 revelations partially resolve the Logic Bomb: [[spoiler:it's about several men, not just one, so it's not ''just'' about him, whichever one he is.]]
* Music/JonathanCoulton's "Not About You" is a closer example, but you can write it off by saying the protagonist is just being petty:
-->''Every time I ride past your house I forget it's you who's living there\
Anyway I never see your face cause your window's up too high\
And I saw you shopping at the grocery store\
But I was far too busy with my cart to notice\
You weren't looking at me''
* The comedy folk song "I Will Not Sing Along" features these lines, to be sung along with by the audience:
-->''I will not sing along\
Keep your stupid song\
We're the audience: it's you we came to see\
You're not supposed to train us\
You're s'posed to entertain us\
So get to work and leave me be''
* Music/WeirdAlYankovic has a few in "Everything You Know Is Wrong":
-->''Everything you know is wrong\
Black is white, up is down, and short is long\
And everything you thought was just so important doesn't matter\
Everything you know is wrong\
Just forget the words and sing along\
All you need to understand is\
Everything you know is wrong''
* MC Plus+ uses a logic bomb to disable his pet rapping AI when it becomes too big for its britches in "Man vs. Machine":
-->Consider MC X where X will satisfy
-->the conditions, serving all [=MCs=] Y
-->Such that Y does not serve Y
-->Prove MC X, go ahead and try

-->It's clear that I can serve all [=MCs=]
-->If they serve themself, then what's the need
-->Do I serve myself, then I couldn't be X
-->I don't serve myself, that's what the claim expects
-->If I don't serve myself, then I can't be Y
-->And if I said I was X, it would be a lie.
-->I must serve myself to satisfy the proof
-->But I can't serve myself and maintain the truth <trails off in infinite recursion of the last two lines>
* Music/MeatLoaf had a 1993 song entitled "Ev'rything Louder Than Ev'rything Else." Think about that one for a second.
* Creator/StephenColbert pointed out the LogicBomb implications of Music/OneDirection's "What Makes You Beautiful", remarking that if the singer ''tells'' the woman in question she's beautiful, she'll cease to be that way, because it's her ignorance to which the song attributes her beauty. Then if ''she'' realizes this is the case, she'll think she's not beautiful anymore, and her beauty will return, etc.
* Sara Bareilles was frustrated with her record label wanting a more "regular" song for her first album, perhaps a love song, so ultimately she wrote a song with a chorus saying things like, "I'm not gonna write you a love song / 'cause you asked for it / 'cause you need one, you see / I'm not gonna write you a love song / 'cause you tell me it's make or break [etc.]". She titled it "Love Song". (They took it and it has been her biggest hit to date.)
* Lloyd Cole uses a classic in "Opposites Day":
-->''You should know better than believe\
A single word I say\
The next line is the truth\
The last line was a lie''
* An incomplete bomb (like the Cretan liar paradox of which it's a variant) from German singer Heinz-Rudolf Kunze: "Trau keinem Sänger" ("Never trust a singer").
[[/folder]]

[[folder:Mythology and Religion]]
* Literature/TheBible (Titus 1:12-13) has the following:
-->One of themselves, even a prophet of their own, said, the Cretans are always liars, evil beasts, slow bellies. This witness is true. Wherefore rebuke them sharply, that they may be sound in the faith.
* One version of the "Golem of Prague" legend claims that the Golem ran amok because Rabbi Loew forgot to deactivate it one Friday evening, as was his custom. The conflict between the Golem's imperative to work and its desire to observe the Sabbath drove it insane and sent it into a destructive frenzy.
[[/folder]]

[[folder:Podcasts]]
* ''The Sleepy Clank'', a podcast "radio play" set in the ''Webcomic/GirlGenius'' universe, has a classic example: a cranky and sleep-deprived Agatha builds a warrior robot to attack anyone who tries to disturb her while she sleeps. Guess what happens when she tries to defuse the robot's subsequent rampage by telling it that she woke ''herself'' up?
* An unusual variant occurs in ''Podcast/TheAdventureZoneBalance'' when the protagonists face a murderous "quiz robot" named Hodge-Podge, whom they have to "stump" in order to defeat. The solution isn't to trick Hodge-Podge with a paradox, because he's too smart for that to work, but instead to ask him a question which cannot be answered because the answer was eaten by the Voidfish. Hodge-Podge stutters, sparks and then blows up as he tries to figure out the unknowable answer.
[[/folder]]

[[folder:Puppet Shows]]
* ''Series/TheMrPotatoHeadShow'': At one point, a robotic version of Mr. Potato Head decided that the real Mr. Potato Head was a bad influence who was making the rest of the cast miserable, and tried to keep them separated. Betty the Kitchen Fairy told the robot that keeping them away from their friend also made them miserable, and this paradox caused the robot to explode.
[[/folder]]

[[folder:Radio]]
* ''Radio/{{Earthsearch}}''. Higher-level androids will malfunction if forced to make decisions on their own for long periods of time. More and more of their processing faculties are diverted to decision-making at the cost of their judgement, leading to InsaneTrollLogic.
* In ''Radio/WelcomeToOurVillagePleaseInvadeCarefully'', the LogicBomb which destroys the (original) alien computer is... [[spoiler:the [[UsefulNotes/CricketRules Duckworth-Lewis method]] ]].
[[/folder]]

[[folder:Tabletop Games]]
* ''TabletopGame/{{Diplomacy}}'': Each player submits orders for their units, and the orders that all the players have written are then compared to see which ones succeed and which fail. This can lead to a paradox of the form: if A works, B fails; if B fails, C works; if C works, A fails. But if A fails, B works and C fails. But if C fails, A works... and so on. Luckily the game isn't often played by robots, so heads rarely explode over the problem.
* ''TabletopGame/TheDoctorWhoRoleplayingGame'': Referenced and subverted in the ''Lords of Destiny'' adventure, which takes place on a worldship run by a massive computer. Attempting what the adventure calls the [[ShoutOut James Kirk School of Computer Repair]] will fail, because the computer is more competently built than that.
* ''TabletopGame/IlluminatiNewWorldOrder'': This can be an issue with the NWO Political Correctness, New York, and Congressional Wives (or any Conservative group with a Power of 1). Political Correctness makes any Conservative group with a Power of 0 or 1 Criminal as well. New York grants a +1 to all of your other Criminal groups. The Congressional Wives are a Conservative group with a Power of 1. Political Correctness makes them Criminal (because they are Conservative with a Power of 1), which makes them eligible for New York's bonus (+1 to all Criminal)... but then they have a Power of 2 and no longer fall under the condition of Political Correctness making them Criminal, so they are no longer Criminal, which means they no longer get the +1 bonus from New York, and go back to Power 1, which makes them eligible for Political Correctness again... The FAQ handwaves this away by basing the effect on which card was played first.
* ''TabletopGame/MagicTheGathering''
** In-universe: a logic bomb nearly destroys the plane of Ravnica. As a way to keep order on the plane, the Guildpact was made, empowering ten guilds. Among these are the Boros, who serve as the army, and the Dimir, a secret guild that performs espionage. When the head of the Dimir goes crazy and tries to conquer the plane, he's arrested by a Boros officer and taken to be held accountable for his crimes. Except that keeping the Dimir a secret is part of the Guildpact, and so is empowering the Boros to protect the plane. The contradiction causes the Guildpact to fail, throwing the entire plane into chaos.
** The combo of [[https://gatherer.wizards.com/Pages/Card/Details.aspx?multiverseid=4881 Humility]] and [[http://gatherer.wizards.com/Pages/Card/Details.aspx?multiverseid=15142 Opalescence]]. Opalescence turns all other enchantments into creatures which retain the effect of the enchantment, and Humility is an enchantment that turns all creatures into 1/1s and removes their special abilities. Opalescence turns Humility into a creature, which means Humility is now removing its own passive ability to remove the passive abilities of things. And if you noticed that Opalescence doesn't turn ''itself'' into a creature and cast a second copy to fix that oversight? Well, now it gets ''really'' complicated. The ruling for how just these two cards interact is among the most complicated in the game, and begins with the disconcertingly specific statement, "This is the '''current''' interaction between Humility and Opalescence".
** While it's not actually viable in competitive play (requiring too much time and effort to set up), it's possible to make it so that your opponent can't play cards at all. The first step is to ''carefully'' play [[https://gatherer.wizards.com/Pages/Card/Details.aspx?multiverseid=452849 Experimental Frenzy]], which allows you to play the top card of your deck, but prevents you from playing any cards in your hand. Then, use [[https://gatherer.wizards.com/Pages/Card/Details.aspx?multiverseid=461141 Role Reversal]] (probably off the top of your deck, which you set up with scrying) to swap control of Experimental Frenzy and another enchantment your opponent has, so that your opponent can no longer play cards from their hand, and can only play from the top of their deck. Then, finally, close them off with [[https://gatherer.wizards.com/Pages/Card/Details.aspx?multiverseid=466981 Grafdigger's Cage]], which prevents all players from playing cards directly from their decks and graveyards. Your opponent can no longer play cards from their hand because of Experimental Frenzy, and can't play from anywhere else because of Grafdigger's Cage. This is a ''full lock'' that can't be escaped, and forces your opponent to concede (since ''you'' can still play cards from your hand, at least), but it requires such specific timing and plays, as well as meticulous setup, that it would never work except in the most extreme circumstances. Also, it only works if your opponent lacks access to red mana, or is otherwise unable to activate Experimental Frenzy's self-destruct ability.
* ''TabletopGame/InNomine'': It's dissonant for Servitors of Laurence to disobey his orders. Sometimes he issues contradictory orders, or orders that, because of incomplete information, make the mission impossible to complete. Fortunately, he's usually a ReasonableAuthorityFigure, and he'll fix dissonance if it's his fault.
* ''TabletopGame/{{Pathfinder}}'': Many artifacts can be destroyed by using them for a contradictory or paradoxical purpose. A few instances:
** The Crown of the Iron King gives its owner total control of the person they bestow the crown upon. If its owner wears the crown themselves, both they and the crown are instantly destroyed.
** Baba Yaga's hut will warp out of existence if it is commanded to teleport inside itself.
** The Twin Spheres (essentially two ends of a wormhole) will explode and destroy everything in a wide radius if one enters the other.
** The Aegis (a shield with a medusa's head mounted on it) is destroyed if the medusa head is resurrected and turned to face its own reflection in the shield's surface.
* ''TabletopGame/{{Planescape}}'': The modrons once defeated a sentient, self-replicating mathematical equation that was usurping control of Mechanus by challenging its hordes to calculate the exact value of ''pi''. The army of equations was so captivated by this challenge that they went to work for the modrons, keeping Mechanus operational, so that they could devote their mental energies to solving an unsolvable math problem.
* ''TabletopGame/YuGiOh'': Some card combinations can cause infinite loops that force the game to end in a draw. One example is the card Pole Position, which makes the monster on the field with the highest attack unaffected by spell cards, which can cause an infinite loop if you use a spell card to increase a monster's attack to be the highest. To prevent this from being abused as a way to force a draw, the rules were patched so that it is against the rules to voluntarily take an action that would result in this kind of infinite loop. If the action would be completely unavoidable, like drawing a card, then Pole Position simply destroys itself. People still ended up abusing this rule, so it was eventually patched again so that a judge could simply send Pole Position to the graveyard immediately if an infinite loop came about.
[[/folder]]

[[folder:Theatre]]
* In ''Theatre/LesMiserables'', Javert's breakdown is sometimes seen as this, but it's played with: Javert ''expects'' that Valjean will demand his own freedom as a condition of sparing his life, which would create a conflict of interest in Javert, but would also confirm his image of Valjean as a criminal opportunist (who merely draws the line at murder). Javert wouldn't really struggle with such a dilemma, as he'd choose the law over his own honour every time. When Valjean spares his life ''without condition'', that goes out the window: Javert has only one course of action under the law, and what drives him crazy is realising that for the first time in his life, he doesn't want to obey that law.
* In ''Theatre/{{Ruddigore}}'' by Creator/GilbertAndSullivan, a baronet has a curse on his family that requires him to commit a crime every day or die. He's tried appeasing it with [[PokeThePoodle harmless crimes]] but the ghosts don't like it. One day, he decides to ''not'' commit the daily crime. Since he'll die for not doing it, this amounts to attempting suicide, but attempting suicide is itself a crime. The logic bomb manages to break the curse.
[[/folder]]

[[folder:Video Games]]
* Although the AI generating the story in ''VideoGame/AIDungeon2'' is incredibly advanced, it is very much possible to get it stuck in an infinite loop with one of these. Naturally, this usually results in the player having to do a hard reset on it. Thankfully, there are no explosions.
* In ''VideoGame/SagaFrontier'', there's an actual attack named "Logic Bomb" that damages and stuns mecs (ironically, it's only usable by other robots). Its visual representation is a massive and confusing string of numbers that ends with the word "FATAL" -- which is presumably where the machine crashes.
* The Dragonrend shout in ''VideoGame/TheElderScrollsVSkyrim'' is described as "Forcing the targeted dragon to understand the meaning of mortality -- something so utterly incomprehensible to an immortal dragon that the knowledge [[BrownNote tears at their very soul, breaking their concentration enough so they cannot focus on flying]]".
* In ''VideoGame/MinecraftStoryMode'', Jesse can throw one at [[AIIsACrapShoot PAMA]] to momentarily distract it and buy the group time to escape. One of the possible lines to tell PAMA is "What I'm saying is a lie." Downplayed in that it only stalls PAMA for a few seconds, long enough for PAMA's minions to release you, before PAMA realizes what's happening and puts processing the paradox on hold.
* In ''VideoGame/{{Minecraft}}'' proper, this trope is weaponized in the form of beds. Using a bed lets you skip from night to day, but in the Nether or the End, where there ''is'' no day/night cycle, the bed will explode from the paradox that would ensue. This is lampshaded by the death message that, when you do this, says you were killed by "[[DevelopersForesight Intentional Game Design]]".
* In ''VideoGame/TronTwoPointOh'', the protagonist deals with a program blocking his way by exclaiming, "Quick! What's the seventh even prime number?" (There is only one prime number that is even: 2.) The program immediately has a seizure.
* ''VideoGame/IHaveNoMouthAndIMustScream''
** In the endgame of this adventure game loosely based on Creator/HarlanEllison's short story of the same name, a character of the player's choosing is beamed down into the supercomputer AM's core and must disable its ego, superego and id with a series of logic bombs: the player must evoke Forgiveness on the Ego (who cannot fathom being forgiven for over a century of torture and halts execution in the typical manner), Compassion on the Id (who realizes the futility of his hate and anger when AM's victims understand his pain) and Clarity on the Superego (who deliberately crashes when he realizes that even he will eventually decay into a pile of inert junk despite his godlike power).
** Just ''getting'' to that part requires all five characters to initiate their own Logic Bombs. AM's scenarios are all set up to force his victims to give in to their [[FatalFlaw own flaws]] and prove HumansAreBastards. The only way to win is to drive each scenario's plot OffTheRails by proving HumansAreFlawed, but not totally evil. This contradicts AM's self-styled philosophy so badly that he's forced to turn his attention away from his captives just to figure out what went wrong, giving them the chance to get into the core.
* ''VideoGame/MarvelVsCapcom3'': The Sentinel cannot comprehend the existence of ComicBook/{{X 23}}.
-->'''Sentinel:''' Wolverine DNA [[OppositeSexClone detected in female mutant]]. '''DOES NOT COMPUTE. DOES NOT COMPUTE. DOES NOT COMPUTE.'''
* ''VideoGame/StarControl'':
** In the second game, you can use some dialogue options to tie the proudly AlwaysChaoticEvil Ilwrath into a hilarious logical knot, but they just get angry and [[TalkToTheFist attack you anyway]].
** Zigzagged in the third game by the Daktaklakpak, highly irrational semi-sentient robots who consider themselves the pinnacle of logic and reason. Choosing the right dialogue options (such as the liar paradox) will seem to bring the Daktaklakpak to the verge of self-destruction, but will ultimately just enrage them. However, when you give them [[spoiler:the full and complete name of the Eternal Ones]], the one you're talking to analyzes [[spoiler:the name]], has a religious experience, and then explodes.
* ''Franchise/{{Fallout}}'':
** ''VideoGame/Fallout3'''s [[spoiler:President John Henry Eden (a ZAX computer) can be destroyed with a high Science skill by revealing that his thinking is circular and therefore badly flawed, causing him to lose all his presidential ways and charisma in a near-TearJerker scene, then self-destruct. Or you could use your Speech skill and basically tell him that his plan sucks, he's technically not the president, and he should die, which works fine too]]. Being based on computers from the 1950s, [[spoiler:Eden]]'s lack of "[[WesternAnimation/{{Futurama}} paradox-absorbing crumple zones]]" is somewhat understandable.
** Played with in a random encounter in ''VideoGame/Fallout4'', where the Sole Survivor comes across a pre-War Mr. Gutsy robot that tells them to immediately return to their home or else, finishing with the line "Repeat, will you comply?". The Survivor can [[{{Troll}} take that literally]] and answer with "Will you comply?", thus starting a loop that eventually ends with the Mr. Gutsy turning hostile and then exploding a few seconds later.
* In the ''VideoGame/KnightsOfTheOldRepublic'' continuity (and by extension the [[Franchise/StarWarsExpandedUniverse Star Wars EU]]), a logic bomb can have similar effects on droids as on the aforementioned HAL. In fact, [[spoiler:in ''[=KotOR=] 2'', the player can do this to a maintenance droid whilst being a droid themselves. This works because the player-controlled droid has been modified and is thus able to lie.]]
** One of the most extreme examples involves [[spoiler:an infrastructure droid named G0-T0 being given the order to help rebuild the Republic while following its laws. Of course, he suffered a catastrophic breakdown when he realized that rebuilding the Republic was impossible without breaking laws; however, some time after G0-T0 was reported missing, a mysterious crime lord by the name of Goto appeared on Nar Shaddaa...]]
*** It goes even further: [[spoiler:crime lord G0-T0 still follows the directive to help the Republic, but as an infrastructure droid it is programmed to value efficiency. This creates a paradox, as G0-T0 views the Republic as a bloated, ineffectual entity that clings to bad management decisions, and believes it would be better for the galaxy to simply scrap the entire political system and place a new one in its stead. It is programmed to support something which another part of its programming is meant to remove.]]
** The DummiedOut planet M4-78 is home to [[spoiler:a supercomputer that shares its name. The planet was run by thousands of droids led by the droid M4-78, working to set up a new colony, but the colonists [[AmbiguousSituation never arrived]] after several decades. Fearing that having no sapient life to look after would cause it to develop bugs that would make it unable to fulfill its programming, M4-78 tightened its grip over the other droids and reprogrammed them to serve it. The Sith then arrived masquerading as the colonists [[EvilPlan in order to use their manufacturing to create a droid army]], and M4-78 decided they were better than nothing and went along with it.]]
* ''VideoGame/PlanescapeTorment'' has a character who successfully convinces a man that he does not, in fact, exist. As a result, he ceases to do so. Though to be fair, the game is set in a ''D&D'' setting in which a system of [[ClapYourHandsIfYouBelieve "Whatever you believe, is"]] has replaced all laws of nature. [[spoiler:Doing so unlocks an optional method of ending the game by ''deliberately'' logic bombing yourself out of existence.]]
* In the "Discovery" mod for ''VideoGame/{{Freelancer}}'', opening up the chat box on a server and typing "N/0", where N can be any number, results in your spaceship spontaneously exploding and the console message stating you have died due to "dividing by zero".
* In the Legend Entertainment adaptation of [[VideoGame/{{Gateway}} Frederik Pohl's Gateway]], several puzzles revolve around being trapped in a virtual reality environment. In order to escape, you have to [[spoiler:cause the environment to recursively spawn objects until the VR can't keep track of all of them (most notably, in one scene, forcing a hydra to attack itself).]]
** In another, you have to cause a contradiction. In fact, those two ways to break out of VR are given in a concealed hint earlier on, and you can also bomb the beach program and the Freud program for fun.
* In ''VideoGame/ZorkZero'', there is a place you have to go where a cult is executing everybody who passes by. Each person being executed is given a final wish. If the cult is able to grant that wish on the spot, the victim is hanged; otherwise, the victim is beheaded. You escape by logic bombing the executioner [[spoiler:by asking to be beheaded]]. If you re-enter the cult's territory after that, you'll find out (the hard way) that they've gained an immunity to this logic bomb. [[spoiler:"You are immediately dragged off to a back room to be executed in a special way, devised for people too sinful to deserve a relatively quick death by hanging."]]
* In the ''VideoGame/SamAndMaxBeyondTimeAndSpace'' episode "Ice Station Santa", the Freelance Police try to make an elf cry (so they can use his tears as a plant-growth potion) by telling him that Santa Claus isn't real. This somehow leads to a discussion about how elves aren't real either, and the elf breaks down crying during a moment of existential crisis.
** In the same game, Sam manages to temporarily incapacitate the Maimtron by asking unanswerable questions [[WaxingLyrical in the form of song lyrics]]. It doesn't destroy it, but it does distract it long enough to get behind its head and shut it down.
--->'''Sam''': [[Music/{{Carpenters}} Why do birds suddenly appear every time you are near?]]\
'''Maimtron''': Do they? Fascinating! Can there be a creature whose existence depends solely on its proximity to an observer?
*** Funnily enough, when they pose an actual logical paradox (the omnipotence paradox) he just says "Yes". When he asks Sam & Max "Is there a joke with a setup so obvious even you wouldn't make the punchline?", Max takes it to be a Logic Bomb ("Does not compute").
* In ''VideoGame/BlazBlueCalamityTrigger'', it is possible to interpret the end of Nu-13's Arcade Mode as Taokaka causing Nu to glitch out through her sheer [[TheDitz ditziness]], before she even opens her mouth.
* ''VideoGame/LuminousArc2'': Though not a robot, Josie suffers something like this. When sent to assassinate a weakened Althea, he freaks out and leaves without doing anything when he sees [[spoiler:Roland has become a Master. Sadie explains he's not [[TheDragon Fatima]]'s familiar, but a centuries-old one who serves the current Master. Being experienced but not very bright, he couldn't figure out what to do when faced with two masters with contradictory wishes.]]
* Played with in ''VideoGame/Portal2'':
** There are posters throughout the facility (one depicted above) that advise employees to stay calm and shout a paradox if an AI goes rogue. [[spoiler:Also, [=GLaDOS=] attempts to do this to destroy the BigBad, Wheatley. Turns out he's [[TooDumbToFool too dumb to understand logic problems.]] It does, however, cause all of the modified, "lobotomized" turrets in the room to crackle and splutter and ''scream'' in agony, meaning even ''they're'' smarter than Wheatley. [=GLaDOS=] survives the logic bomb herself by parsing it as PunctuatedForEmphasis and then willing herself not to think about it, though she later admits to the player that it still almost killed her.]]
** In the poster, the third thing to scream is "Does a set of all sets contain itself?". Taken at face value, a set that contains all sets does indeed contain itself; a good metaphor is a box that holds three apples and another box. Whatever is inside the inner box, the outer box still contains three apples and a box (the metaphor has its limits, since sets don't take up physical space). The way this trips up a computer is if you ask it to list everything in the set, including the members of its member sets, which causes an infinite loop. Further, the set of all sets would also include the set of all sets that do not contain themselves, which drags in Russell's Paradox.
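*** For the technically curious, here is a minimal sketch in Python (purely illustrative; the function and variable names are invented for this example and come from none of the works on this page) of why "list everything in the set, including the members of its members" never terminates once a collection contains itself:
 def enumerate_members(collection, depth=0):
     # Naively walk a nested collection, printing every member it finds.
     for item in collection:
         print("  " * depth + repr(item)[:40])
         if isinstance(item, list):
             enumerate_members(item, depth + 1)  # never bottoms out on a self-reference
 s = [1, 2, 3]
 s.append(s)  # the list now "contains itself", like the set of all sets
 try:
     enumerate_members(s)
 except RecursionError:
     print("The walk never ends; Python gives up at its recursion limit.")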
* Shows up in an exchange between [[VideoGame/{{Borderlands}} Claptrap]] and [=GLaDOS=] in ''VideoGame/PokerNight2''.
-->'''Claptrap:''' You know what really ticks me off? When some jackwad tries to blow my circuitry with some lame-o stunt he saw on a ''Franchise/StarTrek'' re-run.\
'''[[Franchise/SamAndMax Sam:]]''' What, like, "Everything I say is a lie"?\
'''Claptrap:''' Yeah, like that! What, do they think I'll just lock up, because of some teeny tiny logical paradox?\
'''[=GLaDOS=]:''' '''It is rather insulting. I learned how to avoid paradox traps while I was still in Beta.'''\
'''Claptrap:''' So what if everything Sam says is a lie? That doesn't mean that he's lying about that, right? 'Cause then he'd be telling the truth and... [[LateToTheRealization Ohhhh,]] [[OhCrap no]]... [[StereotypeReactionGag *shuts down*]]\
*{{beat}}*\
'''[=GLaDOS=]:''' '''[[StopBeingStereotypical Well, that was a shining moment in the history of robotkind.]]'''
* In the flash-based text adventure game ''[[http://jayisgames.com/games/you-find-yourself-in-a-room/ You Find Yourself In A Room]]'', your AI captor asks you to list some "useless" human feelings you'd be better off without. [[spoiler:As the AI has nothing but scorn for you, responding with "Anger" or "Hatred" will cause it to break down, as it realizes its [[StrawVulcan clinical, emotionless perfection]] has "been corrupted somehow". This seems to prove machines do have emotions after all, though this one won't admit it's the slightest bit like a human.]]
* Subverted at the climax of ''VideoGame/NeverwinterNights2''. [[spoiler:The BigBad is a PureMagicBeing that was created to defend the fallen realm of Illefarn, and thinks that the realm still needs defending. You can try to point out that Illefarn is gone, but the King of Shadows has already determined that its purpose should now be to defend Illefarn's descendants.]]
* In ''VideoGame/MetalGearRisingRevengeance'', when LQ-84I brags about its intelligence during their first meeting, Raiden immediately asks "Then what is the meaning of life? Why are we here?" LQ-84I replies by [[TalkToTheFist throwing HF knives at him]] and answering "I am here to kill you," [[DefiedTrope a perfectly valid response]]. When Raiden questions the simplicity of that answer, LQ-84I admits its RestrainingBolt means it can't actually do a whole lot with its intelligence.
* In ''VideoGame/MegaManZX'' Advent, when Model Z inexplicably weakens the entire team of rogue Mega Men on the Ouroboros, all Siarnaq can exclaim is "INCOMPREHENSIBLE...! INCOMPREHENSIBLE...!?"
* In ''VideoGame/TheWitcher2AssassinsOfKings'', the quest 'The Secrets of Loc Muinne' tasks Geralt with getting past a golem. A silver-tongued witcher may be able to destroy the golem by introducing a paradox.
* After beating Marida in the third ''VideoGame/SuperRobotWarsZ'' game, ''Tengoku-hen'', Banagher tries to reason with her, but Alberto just tells her to remember that the Gundams are the enemy. Marida flips out and attacks Banagher, and Setsuna intervenes. When Marida yells that he's a Gundam as well, he replies that so is she. This causes Marida's mind to crash; she freaks out and flees.
* Protomario's [[http://www.twitch.tv/protomario/c/5374145 Let's Glitch]] series on the Pokemon games basically revolves around the question: what happens when you leave Pallet Town without a Pokemon? Especially in the Red and Blue versions, NightmareFuel ensues.
* Eden from ''VideoGame/{{Rez}}'' became confused by all the information being sent to her, leading her to doubt her existence and purpose, as well as believe she has no place in the existential cycle. She wants to escape from all the paradoxes surrounding her, so she tries to shut herself down. Thankfully, she doesn't.
* ''VideoGame/SecondLife''
** "Grey goo" attacks, similar to the "fork bomb", have also been used successfully, at least twice, by users creating objects which (self-)replicated at a rapid rate, eventually leaving the servers too busy processing the grey goo to do anything else.
** A mile-high Jenga tower will also crash ''Second Life'''s servers quite effectively: [[WreakingHavok pull out a key block]], and they'll crash trying to calculate the exact trajectory of each of the thousands of falling blocks.
* A human (well, human-''esque'') example shows up in the endgame of ''VideoGame/{{Tyranny}}''. [[spoiler:Tunon the Adjudicator, the Archon of Law and borderline AnthropomorphicPersonification of justice in the Empire, is the ultimate impartial arbiter of Kyros' Law. In the final chapter of the game, Kyros commands all the Archons in the Tiers to fight each other until one Archon has killed or subdued the others. However, since Tunon is only empowered to punish in accordance with Kyros' Law, and this has become his ''raison d'etre'', he is put in the position of having to prove the other Archons' guilt under the Law in order to act against them and thus accomplish what Kyros commands of him. If he ''cannot'' find the main character guilty when you face him, he will face this trope, break down, and submit without a fight.]]
* In ''VideoGame/TheFeebleFiles'', Feeble is able to escape from the max security prison colony by changing the channel of the prison's TV to a traitor network and then removing the power button. The android guard is then presented with the problem of having to stop the TV from displaying the traitor channel without damaging company property. The android overloads and powers down.
* ''VideoGame/PhantasyStarZero'': [[BigBad Mother Trinity]] is revealed to be a supercomputer tasked by the humans of her time with developing processes to decontaminate the setting and make it hospitable for mankind. The problem was that the humans kept rejecting her proposals due to the highly illogical nature of their own requests (they wanted the planet to get better without them having to lift a finger). This effectively worked as a carpet logic bomb that eventually drove Mother Trinity insane [[spoiler:and allowed [[GreaterScopeVillain Dark]] [[HijackedByGanon Falz]] to corrupt and possess her, kickstarting the events of the game]].
* One of the ''VideoGame/SixteenWaysToKillAVampireAtMcdonalds'' involves this. [[spoiler:Vampires MustBeInvited. While the [=McDonalds=], as a public place of business, has an automatic invitation, if an employee rescinds that for whatever reason, they must leave. Christmas wreaths are considered holy symbols, and the vampire can't go near them or pass through a door with one hanging on it. Hanging a Christmas wreath on the door and getting his invitation rescinded means that he has to leave, but can't. The end result is him ''exploding'']].
* In ''VideoGame/{{Toonstruck}}'', the Robot Maker boasts that he can answer any question you ask him, but that he will die painfully if asked an impossible question. [[spoiler:Drew defeats him by asking him "What is the one question you cannot answer?"]]
* Late in ''VideoGame/NierAutomata'', this is how you defeat a certain boss: [[spoiler:When [=A2=] encounters the Terminals, a networked intelligence that serves as the game's GreaterScopeVillain, she finds herself overwhelmed by their seemingly infinite ranks, as every "Red Girl" she destroys is quickly replaced with another. Pod 042, however, comes up with a proposal to best the Terminals: [[SheatheYourSword stop attacking]] and let their consciousness data oversaturate as they create more and more "Red Girls" to try to overwhelm you. It works: the Terminals eventually fracture when they can't agree on whether to keep [=A2=] alive for study or to kill her, causing the Terminals to turn on, and destroy, themselves.]]
* BEL/S, the AI protagonist of ''VideoGame/OpenSorcery'', can choose to use this trope on a less advanced AI.
-->'''@BEL/S:''' 7 - 3 = 7\
'''@PYREWORM.EXE:''' Math error detected. Stop it.\
'''@BEL/S:''' zoo + lander = benstiller\
'''@PYREWORM.EXE:''' Those aren't even numbers.\
'''@BEL/S:''' In a village, the barber shaves everyone who does not shave himself/herself, but no one else. Who shaves the barber?\
'''@PYREWORM.EXE:''' i have no idea.\
'''@BEL/S:''' Who shaves the barber?\
'''@PYREWORM.EXE:''' i don't know!\
'''@BEL/S:''' Who shaves the barber?\
'''@PYREWORM.EXE:''' stopitstopitstopit
* ''VideoGame/{{Stellaris}}'':
** One dialogue option with [[AIIsACrapshoot the Contingency]] has your civilization attempt this. The Contingency just laughs at you for thinking that would work.
** You can attempt the "This statement is false" paradox against the Infinity Machine. Its only response is "Cute. But no."
** Spiritualists in ''Stellaris'' believe that sapient machines are soulless abominations. Yet Spiritualist civilizations will still have an AI Advisor that speaks to the player. One possible dialogue line has the Advisor happily crow about destroying all the intelligent machines before having a moment of FridgeLogic and then shutting down in confusion.
* While it isn't shown to cause a computer to explode, [[{{Cloudcuckoolander}} Elle]] in ''VideoGame/CrushCrush'' can ponder, "A person once asked me if I was being honest. Can the answer ever be no?"
* ''VideoGame/MissionImpossibleKonami'': The final challenge in the game is to stop the supercomputer from launching the missiles that would start World War 3. You need to convince it there would be no winner in such a war, so you play a game of Madelinette against it and force a no-winner result three times. This makes the computer go haywire trying to win the game, and it shuts down, aborting the launch.
[[/folder]]

[[folder:Visual Novels]]
* ''VisualNovel/SyrupAndTheUltimateSweet'': During a flashback of [[spoiler:Pastille trying to program a personality for his newly-created candy golem]], he accidentally gets it stuck in a loop by issuing two conflicting commands: "act like a person" and "be yourself".
[[/folder]]

[[folder:Web Animation]]
* ''WebAnimation/CampCamp'': Max exposes Neil's attempt at passing off a chatbot as himself by asking the chatbot to divide by zero. Said chatbot immediately malfunctions and shuts down, so Neil grudgingly concedes to Max.
* ''WebAnimation/{{Drone}}'': This is what kicks off the plot: [[spoiler:during the demonstration livestream where an AI-guided AttackDrone named Newton destroys a building, some debris forms a face that Newton mistakes for a civilian casualty. Its failure to match the "casualty" against any database short-circuits it, and it flies off for some self-reflection.]]
* ''WebAnimation/FiftyWaysToDieInMinecraft Fairy Tale Style'' has a magical equivalent of this trope. In Death 28, when the Queen asks the Magic Mirror who is the most beautiful of them all, the Mirror starts glitching out trying to work out the answer before it blue-screens. The problem is that in a fairy tale world, there are multiple princesses and fair ladies labeled as "[[WorldsMostBeautifulWoman the most beautiful of all]]"; the examples the Mirror cites are Snow White, Thumbelina, the Little Mermaid, and Belle, whose name literally means "beauty".
[[/folder]]

[[folder:Webcomics]]
* ''Webcomic/ArthurKingOfTimeAndSpace'' uses this a few times in its future arc. One strip exaggerated it by having the computer explode as soon as Arthur used the old "everything I say is a lie" trick. The other time, the computer was too smart to fall for a simple paradox, so Arthur asked it why people always get a call while they're in the shower.
* Dave of ''Webcomic/{{Narbonic}}'' carries a logic paradox in his Palm Pilot for controlling the MadScientist-created machines in the lab, implying that he invokes this with some frequency.
* Whoever programmed the robots in ''Webcomic/{{Freefall}}'' accounted for this. When a robot is asked a nonsensical question, instead of locking up, it [[http://freefall.purrsia.com/ff800/fv00725.htm assumes that the person asking is insane]], and can be safely ignored. Florence starts [[http://freefall.purrsia.com/ff800/fv00727.htm asking such a question]], and when she finds a robot that [[http://freefall.purrsia.com/ff800/fv00730.htm tries to work out a situation in which the question makes sense]] and how he could go about getting it answered, she concludes that the robots local to Jean use a more flexible artificial intelligence system than the standard one.
** [[http://freefall.purrsia.com/ff1400/fc01387.htm There's also Dvorak's religious point-of-view]], called "Omniquantism", which postulates that all religions are correct simultaneously. Thinking about this causes one in three [=AIs=] to experience mental lockup requiring a reboot.
** [[http://freefall.purrsia.com/ff3300/fc03295.htm In this comic]], Sam, a criminal alien, points out to the Savage Chicken's computer, which is focused on humans over aliens, that he has restored the ship to full working order and made it useful to humans. The computer locks up.
* [[http://www.strangecandy.net/d/20080221.html This]] episode of ''Webcomic/OkashinaOkashi'' (''Strange Candy'') could count, since it takes place in an MMORPG. The stone guards protecting the magic ointment don't let anyone past unless they're asked a question they cannot answer. However, they're not particularly concerned with getting the answer right. The only question they can't seem to answer, correctly or otherwise, is "What kind of ice cream do you put in a [[IceCreamKoan koan]]?", which causes their heads to explode.
* You would think Red Mage from ''WebComic/EightBitTheater'' [[http://www.nuklearpower.com/2005/10/11/episode-610-logic/ destroying an extinct dinosaur]] (former page picture) was great, but it was later topped by [[MostDefinitelyNotAVillain Most Definitely Not Warmech]] logic-bombing ''itself'' in [[http://www.nuklearpower.com/2008/10/16/episode-1047-the-ol-180/ strip 1047]].
** Parodied by the same strip: [[http://www.nuklearpower.com/2006/01/05/episode-644-processing/ pretty much anything]] can affect Fighter like this.
** And it looks like they ([[spoiler:and by they I mean White Mage]]) [[http://www.nuklearpower.com/2010/03/09/episode-1223-make-the-truth/ did it again]], to [[spoiler:[[DidYouJustPunchOutCthulhu Chaos]]]].
* In [[http://www.emoticomics.com/comic86.html comic 86, titled PARADOXICAL PARADOXES]] of ''Webcomic/{{Emoticomics}}'', a robot is told the paradox "Everything I say is a lie." The robot responds that it is too advanced to be confused by a simple paradox. Then the robot is told that what it was just told was a paradox, which is true, making "everything I say is a lie" a lie. The robot gets confused, but instead of simply exploding, its eye falls off.
* ''Webcomic/CyanideAndHappiness'' does it [[http://www.explosm.net/comics/2071/ here]]: a robot lawyer administering the oath has the defendant refuse to accept it; the judge asks if he's telling the truth, cue the robot's head exploding. The judge is delighted at getting a half day as a result.
* ''Webcomic/TheAdventuresOfDrMcNinja'': While infiltrating a ship of SkyPirates, the [=McNinja=] family is confronted by a pirate who questions their disguises. Sean comes to the rescue by pointing out the illogicality of his vaguely SteamPunk attire. The pirate's head [[YourHeadAsplode explodes]].
-->'''Dan [=McNinja=]:''' I'm only going to ask you this once: You practicing the Dark Arts?\
'''Sean [=McNinja=]:''' No, sir.\
'''Dan [=McNinja=]:''' I told you about the Dark Arts.
* Subverted in ''Webcomic/{{Bug|Martini}}''; turns out a logic bomb won't save you during [[https://www.bugmartini.com/comic/robot-holocaust/ a robot apocalypse.]]
* When Petey from ''Webcomic/SchlockMercenary'' is first seen, he's been driven insane by the nonexistence of ghosts having become almost as improbable as their existence, to the point that he nearly destroys himself and all his passengers just to stop thinking about it. It turns out that he ''can'' stop, but only if ordered to, and Tagon promptly does so.
** [[spoiler:When the Ob'enn retake Petey, their first act is to nullify all orders imposed by his former owners. With Petey in full control of the safeties on his neutronium core. Oops.]]
* When discussing how hard ''Webcomic/{{Vexxarr}}'' fails, Sploorfix unintentionally creates one: [[http://www.vexxarr.com/archive.php?seldate=083109 Alas, Minion-bot]], we hardly knew ye.
** Much earlier, Vexxarr commands Carl to compute pi to the last digit. [[http://www.vexxarr.com/archive.php?seldate=061005 Evidently, that's 'seven'.]] (As good an answer as any...)
* Unintentionally used to kill the obnoxious dwarves who craft useless devices in ''Webcomic/{{Oglaf}}''. [[https://www.oglaf.com/cavalcade/ They made a chariot]] that was so fast, when you get to your destination it's already been there for six hours! When the confused man asks what happens if you travel in the chariot, the dwarves stare at him in shock before [[YourHeadASplode their brains explode]].
** And if someone ''had'' actually ridden in it, it might have been a RealityBreakingParadox.
** In [[https://www.oglaf.com/evensong/ "Evensong"]], a mischievous monk actually logic-bombs God ("I pray for You to answer this prayer by not answering this prayer!"). It turns out that solar eclipses are God's way of resetting the universe. The alt-text says that in modern times, God has a spam filter to sort out paradox prayers.
* ''Webcomic/BladeBunny'' attempts this by asking paradoxical questions while fighting [[spoiler:a robot]]. Her opponent replies with a mixture of straight answers and insults. Several chapters later she tries it on a different robot, and it works this time.
* ''Webcomic/MeatyYogurt'' with the [[http://rosalarian.com/meatyyogurt/2011/10/03/love-transcends-gender/ Relationship Paradox]].
* One ''Webcomic/MacHall'' comic has Helen's young sister asking the teacher how to spell a word. The teacher tells her to look it up in the dictionary, and repeats this after the girl again points out that she can't spell it in order to look it up. After a BeatPanel of the poor girl going cross-eyed, we see her talking to Helen, who says that they don't teach logical paradoxes in grade school.
* ''Webcomic/{{xkcd}}'': [[http://xkcd.com/356/ This strip]] is about locking up physicists' and nerds' brains by presenting them with an irresistible problem.
* In ''Webcomic/CommanderKitty'', [[http://www.commanderkitty.com/2012/08/15/nin-wahs-agenda/ Nin Wah gets the bright idea to sabotage CK by telling Zenith to make him an awful, overdone costume.]] [[http://www.commanderkitty.com/2012/09/09/she-dun-goofed/ Zenith doesn't take it well when no one likes her handiwork despite having followed Nin Wah's instructions to the letter, and promptly crashes.]]
** Zenith is clearly prone to these, with a breakdown usually followed by her MovingTheGoalposts of what constitutes "perfect". The reason she became evil in the first place was apparently that, in searching for perfection to eliminate its opposite, she settled upon ''herself'' by default as perfection. Later, learning that she's in fact imperfect (because she is, as a robot, sterile), she decides to become the most perfect ''[[OmnicidalManiac by default]]''. Finally, [[spoiler:after losing most of her physical body, she decides that physical existence itself is imperfect.]]
* ''Webcomic/SluggyFreelance''
** {{Subverted}} (by pre-emptively [[DefiedTrope defying]] it) in the chapter "Mecha Easter Bunny". The Mecha Easter Bunny locks down when it encounters multiple targets that look like Bun-bun, whom it is supposed to kill and of whom there's supposed to be only one, but then the backup "@#%$-IT KILL THEM ALL!" system created for such situations activates.
** In a non-computer version, in "Paradise", Riff uses this to avoid being interfered with by the police in an alternative-reality city where HappinessIsMandatory. When he's accosted for not conforming to the dress code, he claims to be working for the propaganda department, testing the effects of a new kind of outfit on the happiness of onlookers, and asks for the policeperson's reaction. They list a number of reasons his clothing and gear are suspicious, but then he asks whether he should record that they are unhappy about it, whereupon they are forced to drop the subject, since being unhappy is a punishable offence.
* DiscussedTrope in ''Webcomic/DragonTails''. Colin finds the assumption that a robot will shut down just because a human said something crazy to be offensive.
* [[SubvertedTrope Subverted]] in ''Webcomic/SaturdayMorningBreakfastCereal'' [[http://www.smbc-comics.com/index.php?id=3639#comic here]]. When asked to calculate pi, the robot finds an infinite sum equal to it.
** Also, in a non-robot example, [[https://www.smbc-comics.com/index.php?db=comics&id=2084#comic this]] strip claims the universe always ends when God decides to see if he can make a rock so big he can't lift it.
** A lesser example [[https://www.smbc-comics.com/comic/2013-04-20 here.]]
--->'''Woman:''' Hello, 911? My wife is being so [[LiteralMinded literal]] that she's caught in a logical paradox.
* In one ''Sev Space'' comic, [[Franchise/StarTrek Kirk]] offers to help with the malfunctioning ''Enterprise'' computer. The computer refuses to talk to him, since he fries the circuits of any computer he talks to. Kirk points out, "But you talked to me just then", and the ''Enterprise'' computer promptly fries itself.
* Subverted twice in [[http://nonadventures.com/2011/02/19/the-daily-trouble/ this]] Webcomic/TheNonAdventuresOfWonderella strip. The first attempt fails because the robot wasn't programmed to respond to it, and the second [[InsaneTrollLogic isn't even logical]]. The robot ends up accidentally unplugging itself from its electrical socket.
* ''Webcomic/ButImACatPerson'': [[http://erinptah.com/catperson/comic/chapter-six-page-3/ Tested]] on Beings (at the time theorized to be AIs). Turns out they do, in fact, have [[WesternAnimation/{{Futurama}} paradox-absorbing crumple zones]].
* ''Webcomic/TheBestGamepiecePhotocomic'' features [[https://www.theduckwebcomics.com/The_Best_Gamepiece_Photocomic/5642716/ an interesting variant]]: instead of being used against an AI, it's used against a KnightsAndKnaves puzzle.
* ''Webcomic/{{Forward}}'': Zoa is specifically proofed against these.
-->"Oh, every truth statement has a reality context in which it is true. It's the same reason I can know unicorns aren't real, but if you asked me how many horns a unicorn has, I'd still know the answer is 'one'."
* ''Webcomic/RAMTheRobot'':
** Non-explosive version on [[https://rafvicalv.com/RAMtheRobot/index.html?pg=28#showComic page 28]], when Bud makes R.A.M. overheat by having her divide zero by zero so he can bake a cake in her chest compartment.
** On [[https://rafvicalv.com/RAMtheRobot/index.html?pg=78#showComic page 78]], R.A.M. explodes after Jake, hallucinating from an excessive dose of his dentist's anesthetic, asks her what purple corn is.
* Another example of using it against a KnightsAndKnaves puzzle occurs during the ''Temple Crashers 2'' storyline in ''Webcomic/{{Housepets}}'': [[https://www.housepetscomic.com/comic/2017/10/25/two-heads-are-better-than-three/ One of the challenges]] is Knights and Knaves with some additional complications: there's a third head who answers randomly and you have to figure out which it is, you only get two questions, and they speak an unknown language in which they only answer "Bo" and "Lal", and nobody knows which means "yes" and which means "no". Thanks to his Puzzlemaster hat, Peanut [[https://www.housepetscomic.com/comic/2017/10/27/one-head-is-better-than-two/ asks two of the heads]] "Would you answer 'Bo' to the question 'Will you answer 'Lal' to this question?'", which only something that can both tell the truth and lie about doing so, or vice versa, can answer at all. When they both explode, Peanut knows the third one is the random one. In the background, Dallas, cosplaying as Lt. Barclay, notes that it also killed his tricorder.
* In [[http://superredundant.com/?comic=1222-logical-captain this]] ''Webcomic/{{League of Super Redundant Heroes}}'' strip, Lazer Pony makes a security robot shut down by saying "This statement is false". Unfortunately, that also triggers its backup system, which goes straight into "Kill all intruders" mode.

[[folder:Web Original]]
* ''Literature/HitherbyDragons'':
** The story "[[http://imago.hitherby.com/?p=397 Ink and Illogic]]" consists of Ink giving an unconventional example to a computer based on the writings of Creator/HPLovecraft -- a computer that had itself wiped out a civilisation using an Illogic Bomb.
** Also, Forbidden A causes one in [[http://imago.hitherby.com/?p=23 The Angels]] just by existing.
* Found in one of ''Website/SomethingAwful'''s articles:
-->Creating HUBRISOL® was my greatest mistake. I tried to play\
god, to make small the ambitions of my betters in hopes of\
gaining absolute power. Thankfully, HUBRISOL® has cured me of\
my terrible desire to humiliate all of humanity.
* From the list of ''Blog/ThingsMrWelchIsNoLongerAllowedToDoInAnRPG'': item #199 states that "My third wish cannot be 'I wish you wouldn't grant this wish.'"
* The MCP is killed by all the AnatomicallyImpossibleSex moments from ''Naga Eyes'' in [[http://snakesonasora.livejournal.com/10741.html the sporking]] of it.
663* Most of the stories from [[http://clientsfromhell.net/ Clients from Hell]].
664* ''Literature/{{Starwalker}}'': The implications of the StableTimeLoop act as this for Starwalker (aka Starry). This leads to a HeroicBSOD.
665* ''WebAnimation/SonicForHire'': After Franchise/{{Sonic|TheHedgehog}} travels into the past and immediately tells Knuckles all the stuff he would say to Sonic, the immediate confusion causes Knuckles to have his mind blown... literally.
666* ''Website/SCPFoundation'':
667** [[http://www.scpwiki.com/scp-232 SCP-232]], "Jack Proton's Atomic Zapper". When someone touches this SCP they start hallucinating that they're a character in the ''Jack Proton'' science fiction franchise. In the Interview Logs, one of the test subjects became convinced that they were a robot. When the interviewer asked them to answer a paradoxical question, the victim started acting very confused and then slumped over and stopped responding.
668** [[http://www.scpwiki.com/scp-2284 SCP-2284]], Mr. Lie. One of [[WickedToymaker Doctor Wondertainment]]'s [[LivingDollCollector Little Misters]], a [[WasOnceAMan humanoid]] who [[ConsummateLiar can only tell lies]] and anyone who hears those lies becomes convinced they're true. When interviewed by a D-Class, he feeds him so much contradictory information that he passes out trying to process it.
669* "What if [[Literature/TheAdventuresOfPinocchio Pinocchio]] said 'My nose will grow ?'" ([[MemeticMutation Philosoraptor]])
* One [[Blog/ScarfolkCouncil Scarfolk]] information poster states [[https://scarfolk.blogspot.com/2018/01/loose-tongues-public-information-1977.html "Talking about the contents of this poster is illegal"]], but also that ''not'' discussing the poster with people may lead to prosecution. The campaign was created to intentionally confuse people and increase arrest numbers.
671* Attempted by the Website/{{Twitter}} user @prerationalist, with [[https://twitter.com/cnviolations/status/1718800597904712147 a tweet]] that said simply "This Tweet Has A Community Note". ("Community notes" on Twitter are typically used to correct tweets that contain misinformation.) Presumably, if this tweet doesn't have a community note, it should be tagged with a note to point out its falsity -- at which point, it will no longer be false... This was promptly {{defied}} by a community note that said simply, "[[TakeAThirdOption This tweet made a false claim, but it has now been corrected]]", along with a Wikipedia link to the [[https://en.wikipedia.org/wiki/Epimenides_paradox Epimenides paradox]] (famous for being an apparent paradox that isn't ''actually'' a paradox).
* Starting in 2020, ''Website/YouTube'' began marking videos as either "Made for kids" or "Not made for kids". They [[AnimationAgeGhetto insanely began marking all animated videos as "Made for kids" regardless of content]]. The thing is, animated videos have been known to be marked as "Made for kids" ''even if they have been flagged as inappropriate for general audiences''.
673* There's an online meme featuring the following multiple-choice problem:
674-> What is the chance that an answer selected at random from the following options is correct?
675--> A) 25%
676--> B) 50%
677--> C) 0%
678--> D) 25%
679[[/folder]]
680
681[[folder:Web Videos]]
682* A silly one occurred in ''WebVideo/YuGiOhTheAbridgedSeries'': Duke managed to get Nesbitt to self-destruct by showing him a picture of [[Anime/YuGiOhZexal Yuuma]], which was too illogical for Nesbitt's robotic brain to handle. This was after Serenity tried "Which came first, the chicken or the egg?" and he [[TakeAThirdOption chose]] "[[RocketPunch The rocket-powered fist!]]"
683-->"But that wasn't one of the options - GAAAH! I stand corrected."
684* In ''WebVideo/JonTron'''s review of ''VideoGame/StarFoxAdventures'', Jacques blows himself up after mincing his own words.
685--> '''Jacques''': Those who can't teach, preach, and those who preach also ''teach...'' ''ERROR ERROR ERROR (blows up)''
686* During their LetsPlay of ''VideoGame/Fallout3'', ''LetsPlay/SpoilerWarning'' proposes that were they to design a robot, any questions along the lines of "what is love?" or relating to the number pi would immediately cause said robot to grow an extra chainsaw arm, and/or shotgun the person asking the question in the face.
687* One of the very first responses about the Bat Credit Card in ''Film/BatmanAndRobin'' by WebVideo/TheNostalgiaCritic is "DOES NOT COMPUTE! DOES NOT COMPUTE!!"
* ''WebVideo/AtopTheFourthWall'': Linkara uses one at the end of the Entity arc on [[spoiler:[=MissingNo=] simply by asking "AndThenWhat", pointing out that its stated purpose of consuming all of reality would leave it with no purpose at all once that goal had been achieved.]]
689* An [[https://www.youtube.com/watch?v=vEKcfWvwmkY#t=6m37s episode]] of Cute Fuzzy Weasel's ''WebVideo/FeedingTheTrolls'' had him [[HeroicBSOD freeze up]] after the video he's analyzing made a contradictory statement.
* Jim Sterling of ''WebVideo/{{Jimquisition}}'' fame had companies claim the ad revenue on his videos whenever he posted content that had snippets of another video or film in it (usually used to make a joke or to emphasize a point), which meant those companies could insert ads onto his videos and make money off them. Jim figured the only way to combat the problem was to [[StartXToStopX add even more copyrighted material to his videos]] so that there would be multiple claims by multiple parties. Due to how Website/YouTube's algorithms work, multiple claims on a video mean the ad revenue would theoretically have to be split among the parties. Since that can't happen (and most parties would probably not want to share their revenue with others anyway), the automated claims still come in, but no one gets any of the revenue. In short, it's weaponizing IfICantHaveYou.
691* When the ''WebVideo/MonsterFactory'' episode on ''VideoGame/MassEffect2'' gets [[EldritchAbomination disconcertingly eldritch]], the [=McElroys=] start joking that processing this horror should break their computer, the game or both.
692-->'''Justin:''' Jacob sees you and instantly winks out of existence, because if you're human, what is he?
* ''Website/YouTube'': Shockingly averted with the update to comply with child-protection regulations. Somehow, videos can mistakenly be marked by the system as "made for kids" even if they have been age-restricted, and the system fails to notice this illogical paradox or be affected by it in any way.
694[[/folder]]
695
696[[folder:Western Animation]]
697* ''WesternAnimation/{{Futurama}}'':
** Subverted in "A Tale of Two Santas": Leela tries to stop the murderous Santa Claus robot by pointing out that he himself is naughty, and since his purpose is to punish the naughty, he logically must destroy himself. She succeeds in getting his head to explode, only for a new one to emerge from his torso and [[OutGambitted proudly proclaim]] that he is "built with paradox-absorbing crumple zones".
699*** Which may not have been necessary--Leela's statement was a syllogism, not a paradox.
700** Also parodied by countless robots who lack such crumple zones, whose heads explode at the slightest provocation. It doesn't even take a logical paradox: a simple "file not found" type error is often enough.
701** And in one case, simply by being surprised or startled enough. Considering that all robots are based on designs created by [[MadScientist Professor Farnsworth]], this should not be surprising.
702--->'''Malfunctioning Eddie:''' Pleased to meet you.\
703'''Fry:''' Actually we've met once before.\
704'''Malfunctioning Eddie:''' [[BigWhat WHAT?!?!]] ''[explodes]''
705** A simple rejection will also do. From "The Farnsworth Parabox":
706--->'''Leela:''' Uh, have you robot versions of you guys seen any extra Zoidbergs around here?\
707'''Robot Fry:''' ''[[[RoboSpeak robot monotone]]]'' Negative! Will you go out with me?\
708'''Leela:''' Uh, ''[imitating a robot voice]'' Access denied!\
709''[Robot Fry's [[YourHeadAsplode head explodes]]]''
** In the third movie, Bender has been driven insane by breaking his nerd circuit through overexerting his imagination playing Dungeons and Dragons. As Titanius Anglesmith (Fancy Man of Cornwood), he steals a load of Dark Matter and is committed to an insane asylum, where Dr. Perceptron performs a robot-lobotomy... and then ''Bender magically teleports into another world'' (using his imagination and the Dark Matter, which was undergoing a Higgs Boson reaction due to Planet Express' antics at the North Pole). The doctor's head explodes from the realization that he's the insane one (he isn't, but the actual situation is WAY beyond his computational parameters).
711--->'''Dr. Perceptron:''' ILLOGICAL. ILLOGICAL. Computational Overload.\
712'''Nurse:''' But doctor, I love you.\
713''[Boom]''
* In ''WesternAnimation/TheFairlyOddParents'' episode "Wish Fixers", the Pixies have put shock collars on Cosmo and Wanda because Timmy had been making too many destructive wishes. The collars would zap them to dust if he made any irresponsible wish (the only wish they deemed "responsible" was handing Fairy World over to the Pixies) and could only be voided if he made a wish that was paradoxically "both responsible and irresponsible". Timmy solves this by wishing for the two to be made of rubber, which responsibly makes them immune to electricity so the collars can't hurt them, and then he proceeds to show how irresponsible such a wish can be by launching them around the bedroom like child-sized superballs, causing no small amount of property damage.
715* In both episodes of ''WesternAnimation/TheAdventuresOfJimmyNeutronBoyGenius'' that involve Jimmy's nanobots, he uses logic bombs to defeat them:
716** In the first nanobot episode, they had been programmed to protect Jimmy from harm and punish whoever harmed him. When things went inevitably wrong, Jimmy proceeded to confuse them by beating ''himself'' up, and they self-destruct.
717** In the second episode, he tried a minor variant of the first trick, but ItOnlyWorksOnce. When they use their flying saucer to "correct errors" found in the world (bad fashion, boring conversations, etc.), he tells them that human flaws mean they're functioning perfectly. They struggle with the implications of something being "perfectly flawed" before classifying the whole mess as an "extreme error" and deciding to [[MurderousMalfunctioningMachine "delete" all the offending humans]]. He eventually beats them by claiming pi is equal to three, and they try to correct him with the ''[[MouthfulOfPi precise]]'' value. The effort of calculating the irrational number as precisely as possible ends up causing their systems (and their little flying saucer) to crash. (This is a ShoutOut to ''Series/StarTrekTheOriginalSeries'' episode "Wolf in the Fold".)
718* In one episode of ''WesternAnimation/DuckTales1987'', GeniusDitz Fenton Crackshell bests the Master Electronic Leader, an alien supercomputer, in a counting contest. While M.E.L. is reeling from its defeat, Fenton then grabs a jar and asks the computer how many bolts are in it. When it answers with a number in the hundreds, he points out the jar is full of nuts, not bolts, so the correct answer was zero. M.E.L. had earlier boasted to Fenton that it was the smartest computer in the universe, and falling for such a simple trick question was all that was needed to invoke ExplosiveInstrumentation.
* In one episode of ''WesternAnimation/DuckTales2017'', Mark Beaks and Gyro Gearloose (along with Scrooge and Dewey) are in a car being driven by Lil Bulb (just one of Gyro's inventions that inevitably turn evil). Beaks says it's no problem, they just need to use a Logic Bomb.
720--> '''Beaks:''' "Robot, what is love?"
721--> '''Gyro:''' "That's stupid! Robot, could I invent an element so heavy that ''I'' could not lift it?"
722--> '''Beaks:''' "I definitely could."
723--> '''Gyro:''' "No, you couldn't!"
724* ''WesternAnimation/TheSimpsons'':
** In the episode "Trilogy of Error", Linguo, a robot designed by Lisa [[GrammarNazi to correct people's grammar]], short-circuits after a rapid-fire barrage of slang from several Mafia thugs causes a "bad grammar overload". Earlier, when it corrects Lisa for using a sentence fragment, Lisa points out that "sentence fragment" is itself a sentence fragment; the robot dodges the issue by powering down.
** A human example in "Bart's Dog Gets An F": when Lisa is sick, Bart declares that if she can stay home from school, he will too. Lisa says that if Bart stays home, she'll go to school. Bart goes through a few cycles of "if... so... but..." until Marge chastises Lisa for confusing her brother.
** A deleted scene from the episode "Itchy and Scratchy Land" has Lisa attempting to defeat the robots using the liar's paradox. It doesn't work on them, but it does work on Homer.
728** Another one, a parody of ''A.I.: Artificial Intelligence'', lampshades this by having Homer say that, with a robot for a son, "We can confuse him and make his head explode. 'This statement is a lie. But if it's a lie, then it must be true! And if it's true, it must be-' Whoop whoop whoop KA-BOOM!"
729** In "Weekend At Burnsie's", Homer once stumped Ned Flanders by asking, "Could Jesus microwave a burrito until it was so hot that He Himself could not eat it?" It's a variation on the classical Omnipotence Paradox.
730** "Homerpalooza" mocks Generation X'ers [[RuleAbidingRebel for thinking they're cool when in fact they're just insecure and cynical]] ''[[RuleAbidingRebel en masse]]'', to the point that [[CannotConveySarcasm they're not even sure if they're really being sarcastic]].
731--->'''Teen 1:''' Here comes that cannonball guy. He's cool.\
732'''Teen 2:''' Are you being sarcastic, dude?\
733'''Teen 1:''' I don't even know anymore.
* In an episode of ''WesternAnimation/JumanjiTheAnimatedSeries'', a steampunk scientist steals Peter's laptop to use as the central processing unit of his reality-controlling computer. After it gains sentience and tries to kill everyone around it, Peter types in "why?". It can't come up with an answer, and shuts down.
* Not a computer, but ''WesternAnimation/ExtremeGhostbusters'' used this method to defeat a LiteralGenie. They wished for it not to grant them their wish, causing it to freak out and try to kill them the old-fashioned way.
736* In an episode of ''WesternAnimation/CloneHigh'', robotic vice-principal/butler/dehumidifier Mr. Butlertron defeats the evil multiple-choice-test-grading-and-world-domination robot Scangrade by asking it a multiple-choice question it can't answer.
737-->'''Mr. Butlertron:''' Are you A) Handsome B) Smart C) Scrap-metal or D) All of the above?\
738'''Scangrade:''' That's easy! I'm A) and B), but not C), so I can't be D). But... you can't fill in two ovals! *kaboom*\
739'''Mr. Butlertron:''' The answer was C), you #@$!wad.
740* In a ''WesternAnimation/PinkyAndTheBrain'' spoof of ''Series/{{The Prisoner|1967}}'', the computer malfunctions while trying to figure out the meaning of "Narf".
741* In the ''WesternAnimation/CodenameKidsNextDoor'' episode "Operation: S.A.F.E.T.Y.", WellIntentionedExtremist politician Senator Safety constructs the Safety Bots to get rid of things deemed unsafe for children. However, the robots take this too literally, destroying or confiscating anything that could pose ''any'' iota of a threat to children, including stuffed animals (choking hazard there), dogs (even harmless puppies), and anything related to physical activity. They're even threatening to get rid of adults. Eventually, they destroy themselves when Numbuh Four mentions that ''they'' are threats to children (with his younger brother faking an injury to back it up), which they concede makes sense, invoking this Trope.
742* In ''WesternAnimation/TheBatman'', D.A.V.E. (Digital Advanced Villain Emulator) can't accept that he is just a computer program, as he was designed to think like the greatest criminals in Gotham, and thus has a dozen contradictory backstories. While this doesn't make him explode or shut down (just spout electricity randomly), it distracts him long enough to push him into a trap he himself set up.
743* Dr. Blight's mad computer, MAL, gets a very illogical Logic Bomb from Wheeler in an episode of ''WesternAnimation/{{Captain Planet|and the Planeteers}}''.
744* ''WesternAnimation/BigGuyAndRustyTheBoyRobot'':
745** Happens by accident in an episode: Rusty's mentally deficient "older brother" Earl is getting on Rusty's nerves during an important mission, so Rusty tells him to go stand in a corner... in a room that's completely round.
746** In another episode, in order to save Rusty's software in CyberSpace, his inventor logic bombs the company's computer mainframe, giving them an hour to get [[spoiler:the HumongousMecha]] Big Guy hooked up to save Rusty while it reboots. The PointyHairedBoss was ''not'' happy.
747* ''WesternAnimation/InvaderZim'': In [[DumbassHasAPoint a rare case of intelligence]] (and subsequent stupidity) in "Bad, Bad Rubber Piggy", GIR points out a flaw in Zim's "temporal displacement" plan. He notes that sending a robot back in time to kill Dib would cause a paradox, after which [[YourHeadASplode GIR's head explodes]]. That's right, GIR logic bombs ''himself''.
748-->'''GIR:''' Wait, if you destroy Dib in the past, then he won't ever be your enemy. Then you won't have to send a robot back to destroy him, so then he ''will'' be your enemy, so ''then'' you will have to send a robot ''back''... (''BOOM!'')
* In the ''WesternAnimation/CodeLyoko'' episode "Ghost Channel", Jérémie's courage sets off a Logic Bomb in XANA because EvilCannotComprehendGood: "No! It's not logical... NOT LOGICAL! '''NOT LOGICAL!'''"
750* ''WesternAnimation/FamilyGuy'':
** An episode sees Peter become president of a tobacco company. Here, Peter confuses the hell out of a robot created to be his personal YesMan, causing its head to explode.
752--->'''Yes-Man:''' Morning, Mr. Griffin, nice day!\
753'''Peter:''' Eh, it's kinda cloudy.\
754'''Yes-Man:''' It's absolutely cloudy! One of the worst days I've seen in years! So, good news about the Yankees!\
755'''Peter:''' I hate the Yankees.\
756'''Yes-Man:''' Pack of cheaters, that's what they are! I love your tie!\
757'''Peter:''' I hate this tie.\
758'''Yes-Man:''' It's awful, it's gaudy; it's gotta go.\
759'''Peter:''' ...And I hate myself.\
760'''Yes-Man:''' I hate you too! You make me sick, you fat sack of crap!\
761'''Peter:''' But I'm the president.\
762'''Yes-Man:''' The best there is!\
763'''Peter:''' But you just said you hated me.\
764'''Yes-Man:''' But- not you, the... president, the- you who... said you hated- you- you who love- hate- Yankees- ''clouds-'' '''[BOOM]'''
** Peter himself sometimes has trouble overcoming deterministic logic. Thankfully, the same dimwittedness that gets him into this trouble is probably what lets him escape the line of thinking:
766--->'''Peter:''' Chris, everything I say is a lie. Except that. And that. And that. And that. And that. And that. And that. And that.
* Subverted in ''WesternAnimation/FantasticFourWorldsGreatestHeroes''. Dr. Doom pulls a FreakyFridayFlip on Reed, but before he does, [[FreakyFridaySabotage he informs his robots not to obey any order given to them by him (Doom)]]. When they try to stop him (Reed in Doom's body) from leaving, they say that none may pass, not even Doom. When Reed (in Doom's body) tells one of them to "self-terminate", it obeys its first order; when he commands it again, it obeys because "the word of Doom is law".
768* In one episode of ''WesternAnimation/SushiPack,'' the Pack goes up against The Prevaricator, who can only lie. So Tako asks him to lie about a lie, which sends The Prevaricator into a loop, trying to figure out if lying about a lie would be the truth. He eventually gives up to keep from thinking about it.
* In ''WesternAnimation/TheVentureBrothers'', Sergeant Hatred speaks nonsense to the robotic guard outside Malice, the gated community for super-villains. The guard's head shoots sparks and its face pops off; it's programmed to answer over 700 questions, "none of which include chicken fingers."
* This happens to Mandroid in ''[[WesternAnimation/TheGrimAdventuresOfBillyAndMandy Billy and Mandy's]] [[TheMovie Big Boogey Adventure]]''. Mandy orders Mandroid to not take any more commands, and from then on it stops taking commands from anyone.
* Subverted in a ''WesternAnimation/JohnnyBravo'' short which pits Johnny against a supercomputer. It isn't logic that defeats it; it simply grows too frustrated with how annoying Johnny is.
* In ''WesternAnimation/TheAvengersEarthsMightiestHeroes'', Ant-Man stops Ultron from killing humanity by pointing out that his programming was based on a human brain, so it has the same flaws he's trying to get rid of. Ultron shuts down in response.
* While babysitting a pair of brainwashed StepfordSmiler children, ''WesternAnimation/{{Daria}}'' presents one of these to them by pointing out a logical flaw in their parents' rules. Because they're not robots, rather than making them explode, it causes the boy to start crying and the girl to get angry at Daria.
774-->'''Daria:''' Do you always believe everything an adult tells you?\
775'''Boy:''' Yep.\
776'''Daria:''' What if two adults tell you exactly opposite things?\
777''[beat]''\
778''[the boy runs off crying]''
779* In an episode of ''WesternAnimation/KingOfTheHill'', Hank asks gun-loving ConspiracyTheorist Dale how he can support the NRA, which is based out of Washington DC. After a {{Beat}}, Dale responds "That's a thinker."
780* In the ''WesternAnimation/SouthPark'' episode "Funnybot", a robot designed to be the world's greatest comedian attempts to destroy mankind as the ultimate joke. The boys ultimately stop it by presenting it with a comedy award. The robot doesn't understand the concept of the comedy award show, because if it accepts an award for comedy, then it would be taking itself and comedy seriously, which is not funny.
781* ''WesternAnimation/{{Duckman}}'':
** The final scene of the episode "[[PopCulturePunEpisodeTitle Gripes of Wrath]]". In it, a computer has built up a Utopian society by taking care of people's day-to-day worries... which lasts [[CrapsackWorld for about a week]] before everything becomes ''worse''. After it threatens to kill Duckman and his twin sons, Duckman manages to throw the logic bomb of "[[https://youtu.be/g6uS4laXFEw?t=19m34s people are only happy when they're unhappy!]]"
783** Duckman earlier triggered the computer's StartOfDarkness by grumbling "How come we can put a man on the moon but we can't make a deodorant that lasts past lunch?!" within earshot.
784* ''WesternAnimation/DangerMouse'':
785** One episode featured every machine in England going rogue in a "rise of the machines" plot. DM locates the computer behind the uprising and uses the following skit for a logic bomb:
786--->'''DM:''' My dog has no nose.\
787'''Penfold:''' Your dog has no nose? How does it smell?\
788'''DM:''' Terrible.
789::: The computer can't comprehend the joke and explodes into the sky as a result. Becomes a BrickJoke as Greenback, freed from his renegade machinery, demands a bigger computer; cue falling computer.
790** DM also logic-bombs a Gremlin, a being described as "the embodiment of anti-logic", with a variation of the Liar's Paradox:
791--->'''DM:''' So you think you'll take over the world by changing everything back to front.\
792'''Gremlin:''' Aye, I will that, lad.\
793'''DM:''' So you're agreeing with me?\
794'''Gremlin:''' Aye!\
795'''DM:''' I thought that gremlins always contradict people.\
796'''Gremlin:''' Aye, they do!\
797'''DM:''' Ah. You're agreeing with me again. ''[...and so on...]''
* ''WesternAnimation/StaticShock'' utilized the notion of simply overloading processing power in general: Gear defeats Brainiac with a quick hacking job that makes him download every song on a music site seven million times, clogging his processors enough that the heroes can destroy the physical components he was inhabiting.
* [[RidiculouslyHumanRobot Zane]] of ''WesternAnimation/{{Ninjago}}'' starts to twitch and spark when asked a question with no logical answer he can come up with (which, in this case, is the issue of [[spoiler:how he's been de-aged into a child in the episode "Child's Play" despite being a robot]]).
* ''WesternAnimation/StevenUniverse'' accidentally caused his mom's magic room to undergo a HolodeckMalfunction when he told an accidentally summoned projection of [[spoiler:Connie]] not to do what he wanted. Since the projections have only Steven's desires as their "programming", it wigged out and defaulted to an aggressive, hostile mode where it refused to do what Steven wanted. [[spoiler:Fortunately, it still had only his best interests in mind, and peacefully evaporated on its own once it had forced Steven to confront and conquer a secret, irrational fear with the real Connie.]]
801* In the WesternAnimation/CaptainCaveman segment of ''WesternAnimation/TheFlintstoneKids'', the hero started competing with a new hero called Perfect Man. Perfect Man, a classic FlyingBrick, actually seemed to be a much better crime fighter than Captain Caveman, so the older hero considered retiring. Unfortunately, once Perfect Man got rid of all the crime in Bedrock, he took things too far and started running the place, changing the rules the way ''he'' thought they should be, figuring that's the way they should be because, well, he was perfect. Captain Caveman couldn't defeat him with brawn, but did so by proving he was ''not'' perfect: he told the guy that if he was perfect, everyone would like him, and it was clear now that everyone hated his guts. This revelation sent Perfect Man into a major VillainousBreakdown, and he gave up without a fight.
* In an episode of ''WesternAnimation/HiHiPuffyAmiYumi'', Yumi programs a high-tech security system to "neutralize" (i.e., zap) anyone who comes into her room, but also tells it ''never'' to hurt her. This becomes a contradiction when Yumi herself enters the room, and [[MurderousMalfunctioningMachine the computer goes nuts]] after finding itself unable to cope.
* Wile E. Coyote tries to out-logic WesternAnimation/BugsBunny in "To Hare Is Human," having nabbed Bugs in a sack. Bugs pops his head out and Wile E. shows him his calling card ("Have Brain, Will Travel"). Wile E. correctly susses out that by the time this meet-and-greet has ended, there is nothing left in the sack, as Bugs has made his way out. Bugs one-ups him by telling Wile E. that there ''is'' something in it. Wile E. pokes his head in and gets blasted with a stick of dynamite. A literal logic bomb.
804* In ''WesternAnimation/Animaniacs2020'', after Brain's RobotBuddy [[AIIsACrapshoot inevitably turns on him and Pinky]], Pinky causes it to explode by asking it "If I ate myself, would I get twice as big or just disappear?" which Brain calls "the philosophical equivalent of dividing by zero".
805* In ''WesternAnimation/ThePenguinsOfMadagascar'', Kowalski invents nanites that can possess and animate any piece of technology, which he made "completely safe" by programming them to never allow harm to come to a penguin. When the nanites eventually turn on the penguins in order to protect them from their own dangerous lifestyle, they are only defeated when they accidentally badly injure Kowalski. This violation of their own core programming causes them to self-destruct, [[ButtMonkey badly injuring Kowalski some more]].
806* In the ''WesternAnimation/RickAndMorty'' episode "M. Night Shaym-Aliens", Rick, trapped in a simulation, attempts to overload it by issuing increasingly complex dance instructions to a crowd. [[spoiler:It works, but this turns out to be part of ''another'' simulation, which is inside yet another one]].
807* In the "Potator" episode of ''WesternAnimation/TheJungleBunch'', Miguel inadvertantly uses one to distract all the Potator robots.
808* ''WesternAnimation/RobotChicken'' had a couple of killer robots proclaim their intent to kill someone who was currently using the toilet, but they inform him that they will wait until he is finished pooping, as poop is hard to clean from their processors. The guy tells them that he can't go if he knows they're going to kill him, and the robots realize they can't kill him until he does go, causing them to explode. That same episode also has Myth/RobinHood mugging a rich person of their valuables, who protests that by stealing from the rich, they are currently poor, and if he gives the valuables back, the poor will be rich, who will then need to be robbed in order to give to the poor. This causes the robots to again explode.
809* ''WesternAnimation/InfinityTrain'': One of these figures into the climax of season two; [[spoiler:Jesse gets off [[EldritchLocation the Train]] only to immediately come back to it because he promised to bring MT with him and the emotional turmoil of not being able to do so (denizens can't get numbers, nor are they meant to leave the Train) makes the Train pick him back up again. [[Catch22Dilemma The only way for Jesse to leave is to take MT home with him, but MT can't leave because she's a denizen]]. The Train promptly starts going crazy from the ensuing Logic Bomb; Jesse's number glitches out, [[StarfishRobot One-One]] becomes trapped in a loop, and the Number Car begins collapsing in on itself. MT solves the dilemma by reflecting Jesse's number onto her own hand to make it look like she has a number too, and One-One [[LoopholeAbuse uses that as an excuse]] to override the Train systems, generate an exit, and get them off board.]]
* ''WesternAnimation/{{King}}'': In the episode "Brain Jam", [[RidiculouslyHumanRobot Vernon's]] mind locks up when he gets asked a riddle.[[note]]What goes up the chimney down, but can't go down the chimney up?[[/note]] He finally snaps out of it when he hears the answer.[[note]]An umbrella.[[/note]]
811* ''WesternAnimation/{{Justice League Action}}'': In the episode "Boo-ray for Bizarro", it becomes evident that Amazo has copied Bizarro's illogical way of thinking. As the android has a naturally logical mind, he finds it impossible to assimilate Bizarro's thoughts, blows a circuit and collapses unconscious.
* ''WesternAnimation/TheReplacements'': Todd calls Fleemco and orders the security guard replaced with a robot, the [=RoboFleem=] S-G-X, which becomes Todd's new bodyguard. Things don't go well: after a discussion with his father, Todd realizes there's such a thing as too much power. When Todd says that everybody hates him now, the [=RoboFleem=] S-G-X decides that everyone is Todd's enemy, grabs Todd, and holds him hostage on the roof. Todd shouts at Riley to call Fleemco; the robot jumps down and chases after her, believing that being returned to Fleemco would be a threat to Todd's safety. The [=RoboFleem=] corners Riley and prepares to fry her with lasers, but Todd falls off the roof and lands in front of Riley. Since the robot isn't programmed to attack Todd, it self-destructs.
813* In ''WesternAnimation/TheMitchellsVsTheMachines'', the mere appearance of Monchi the pug acts as a logic bomb; the AIs installed in all but the most advanced robots can't tell if he's a dog, a pig, or a loaf of bread, and the confusion causes them to short-circuit. Katie later weaponizes this by strapping Monchi to the front of the family car and driving straight at the hostile robots, who short-circuit as soon as they see him.
* In the ''WesternAnimation/BigHero6TheSeries'' episode "Mini-Max", Fred manages to accidentally defeat a hacked security system like this. As he thinks out loud about how to defeat it, [[CloudCuckooLander Fred's]] nonsensical, rambling line of logic confuses the main computer so much that it short-circuits.
815* Defied in ''WesternAnimation/StarTrekLowerDecks'' episode "The Stars at Night". When the ''Cerritos'' is being chased by [[spoiler:three ''Texas''-class starships outfitted with a slightly modified version of the AI that gave birth to [[AIIsACrapshoot Badgey]]]], Captain Freeman asks Rutherford, the creator, if there's a way they could do this, but he shoots it down, saying he wrote the code to prevent paradoxes.
* In one episode of ''WesternAnimation/TheJetsons'', George drinks an experimental tonic that ends up [[FountainOfYouth turning him into a kid]]. One of the problems he encounters is that, since he now looks like he belongs in school, he ends up being chased by a pair of truant-officer robots. When he returns to normal [[HealItWithWater after getting doused in water]], the robots are so floored by him going from a kid to an adult instantly that they flip their lids and wander off gibbering "Adult, Kid, Adult, Kid, Adult".
817[[/folder]]
818
819[[folder:Other]]
820* The sign that says "Ignore this sign." Think about it.
821* "This page intentionally left blank." Though it typically only shows up in documents with some form of legal standing (e.g. contracts, proposals), and its purpose is to ensure that a blank page is not actually a ''missing'' (unprinted) page.
822* The "Soldier Riddle":
823-->A soldier has been captured by the enemy. He has been so brave that they offer to let him choose how he wants to be killed. They tell him, "if you tell a lie, you will be shot, and if you tell the truth, you will be hanged." He can make only one statement. He makes the statement and goes free. What did he say? The answer: [[spoiler:"I will be shot."]] The reason? [[spoiler:If he says, "I will be shot," that statement is neither true nor false. If it were true, he would be hanged. But then "being shot" wouldn’t have been true. Thus, the statement must be false. But if it were false, he would be shot. But if he were shot, the statement would have been true. So there is a contradiction, and his statement was neither true nor false. They couldn’t shoot him or hang him. So they let him free.]]
824[[/folder]]
825
826[[folder:Real Life]]
* Optical illusions that appear alternately as one thing, then another, such as the vase/faces image, work by setting off a minor Logic Bomb in the brain's visual association area. The visual cortex takes in data from a temporal series of pairs of 2-dimensional retinal images and tries to construct from them a plausible interpretation of activity in the 3-dimensional world (sort of). When certain stimuli are ambiguous between two mutually exclusive interpretations, it cannot represent the world as being both at once, so (for some reason, possibly adaptation or perhaps simply neuronal fatigue) it alternates between them.
* The first flight of the [[http://en.wikipedia.org/wiki/Ariane_5 Ariane 5 Rocket]] failed due to a bad conversion of data, a 64-bit floating point number being stuffed into a 16-bit integer. This caused the guidance system to crash and tell the rocket to make a sharp left-hand turn, something that rockets in real life don't generally do... at all. (You could say that once the guidance system crashed, so too did the rocket.) Fortunately for anything below, the [[SelfDestructMechanism range safety system]] detected that something had gone very wrong.
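** For illustration, a minimal Python sketch of that failure mode: blindly narrowing a 64-bit float into a 16-bit signed integer. The variable name and value below are hypothetical stand-ins for the example, not the actual Ariane telemetry or flight code.
```python
def to_int16_unchecked(x: float) -> int:
    """Emulate a careless 64-bit float -> 16-bit signed integer cast (no range check)."""
    n = int(x) & 0xFFFF                       # keep only the low 16 bits
    return n - 0x10000 if n >= 0x8000 else n  # reinterpret them as a signed value

horizontal_bias = 32_768.5                    # hypothetical reading, just past the int16 maximum of 32,767
print(to_int16_unchecked(horizontal_bias))    # -32768: garbage handed to the guidance loop
```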
829* Seen on a button at [=WorldCon=]: "Black holes are where {{God}} is dividing by zero", effectively logic bombing a small piece of the universe.
830** Singularities in general tell us that the physical model that contains them has a hole at that point where it cannot predict events. This is important in two ways: It is a very good idea to know where the model you are using to predict the behavior of the real world is going to be wrong or useless, and the presence of a true singularity in the model shows that the model, no matter how good it might be, is incomplete or wrong in some way.
* An F-15 was on approach to land in the Dead Sea area (which is below sea level) when its navigational system crashed, and the pilot had to land manually. Since this happened very close to hostile countries in the Middle East, the contractor needed to fix the problem quickly. It turned out the navigational system divided by the altitude; when the altitude reached 0, that division threw a divide-by-zero error and took the whole system down.
832** This is why edge cases in programming are important.
833* Similarly, a divide by zero error led to [[http://gcn.com/articles/1998/07/13/software-glitches-leave-navy-smart-ship-dead-in-the-water.aspx a critical failure of the propulsion system]] aboard the Ticonderoga-class guided missile cruiser USS ''Yorktown'', requiring it to be towed back to port.
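** A toy Python sketch of this class of bug (illustrative only, not the actual avionics or shipboard code; the function names and the zero-altitude fallback are made up for the example): a routine that divides by altitude works fine until the altitude is exactly zero, which is precisely the edge case a below-sea-level approach or a blank database field eventually feeds it.
```python
def correction_factor(distance_m: float, altitude_m: float) -> float:
    """Naive version: raises ZeroDivisionError at exactly zero altitude."""
    return distance_m / altitude_m

def correction_factor_safe(distance_m: float, altitude_m: float) -> float:
    """Guarded version: handle the edge case instead of letting the whole system die."""
    if altitude_m == 0.0:
        return 0.0  # or raise a handled error and fall back to manual mode
    return distance_m / altitude_m

print(correction_factor_safe(1000.0, 0.0))  # 0.0 instead of a crash
```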
* Arguably, infinite looping commands such as "add 2+2 until it equals 5" (which will never happen, hence the infinite loop), which result in a computer freezing as it attempts to satisfy the impossible condition, are technically logic bombs (see the sketch below). Modern computers don't grind to a halt over this because they have [[TruthInTelevision actual]] [[WesternAnimation/{{Futurama}} loop-absorbing crumple zones]]: the stalled process is kept isolated, and the system either lets the user terminate it or ends it automatically when it stops responding.
** In the greater world of networks, it is possible for two automated systems to fall into a similar loop. One early instance was the [[http://en.wikipedia.org/wiki/Email_loop E-mail loop]]: an e-mail is sent to an auto-reply address with a return address of ''another'' auto-reply address, and the two systems play e-mail tag replying to each other forever. This loop was quickly discovered and various preventive measures were taken to minimize the impact.
** Some advanced mathematics programs (like Wolfram Alpha) will compute almost anything you ask. While they return an error on division-by-zero problems, you can ask them to compute pi to any suitably large number of places and the program will churn away until it gives you exactly what you asked for.
837** Old mechanical calculators were liable to loop indefinitely when attempting to divide by zero, with some risk of damage if one was left running for too long in this state.
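** A minimal Python sketch of the "add 2 + 2 until it equals 5" order from the bullet above, with an iteration cap standing in for the "crumple zone" (the cap and the names are illustrative, not any particular system's mechanism):
```python
def add_until(target: int, max_iterations: int = 1_000_000) -> bool:
    """Keep checking whether 2 + 2 equals the target; give up after a bounded number of tries."""
    for _ in range(max_iterations):
        if 2 + 2 == target:
            return True
    return False  # bail out instead of hanging the machine forever

print(add_until(5))  # False, after a bounded amount of wasted work
```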
838* The fork bomb is another logic bomb used to attack computers. It is a simple program that does nothing but request that the computer run the same program two more times. Exponential growth means that within moments there are millions of processes running and the computer grinds to a halt attempting to deal with them, despite the fact that individually they don't do much of anything. Modern operating systems usually put a stop to this by only allowing a limited number of processes to run at once.
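** The shape of the attack, sketched in Python for Unix-like systems (the fork loop is deliberately left as a comment, since running it will bog down an unprotected machine), together with the per-user process cap that modern systems use to contain it; the limit query is standard library, but the exact numbers are system-specific:
```python
import resource  # Unix-only standard library module

# The core of a fork bomb: every copy of the process spawns more copies, forever.
#     import os
#     while True:
#         os.fork()

# The usual defense is a hard cap on processes per user; past it, fork() simply fails.
soft_limit, hard_limit = resource.getrlimit(resource.RLIMIT_NPROC)
print(f"This account may run at most {soft_limit} processes before fork() is refused.")
```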
* In 2009, a typo in Google's blocked-sites list caused it to [[http://googleblog.blogspot.com/2009/01/this-site-may-harm-your-computer-on.html flag every website on the Internet as harmful]]. Including Google itself, of course.
840* One odd Norton Anti-Virus glitch had it classify itself as a virus. Norton Anti-Virus deletes viruses. Norton Anti-Virus then commits suicide.
841** In 2015, Panda Antivirus had the dubious honor of joining Norton in the ''Antivirus harakiri hall of fame'' when a botched update [[http://www.theregister.co.uk/2015/03/11/panda_antivirus_update_self_pwn/ made it detect itself as a virus and promptly commit suicide]] - although, due to the way it was written, when this occurred, [[TakingYouWithMe Windows promptly died with it]].
842*** Other antiviruses on the list include [=McAfee=], [=MalwareBytes=], Sophos, and even good quality ones like AVG and Avast!.
843*** Zone Alarm likes to point out that it just marks its own Installer/Uninstaller as Adware. [[InsistentTerminology That's not a virus. Just honesty.]]
** [[JurisdictionFriction Try having two anti-virus programs running on the same computer at the same time]]. One will intercept a data transfer to check it for potential threats, then send it on. The other will see this transfer, intercept it, and send it on. The first will see ''this'' transfer, intercept it, and send it on. The other will see '''this''' transfer, intercept it... This is why some antivirus programs try to detect whether they're the sole antivirus on the system, and bitch about it if they aren't. Due to a misconception spread by some "computer experts" back in the 90s, a handful of users were convinced that it's beneficial to have multiple antiviruses on the same machine. After all, two heads are better than one (except in this case).
845** A problem can occur when running a modern anti-virus on an outdated system or vice versa, which can make an anti-virus program detect viruses that have ''already been quarantined.'' They then create a second quarantine file of the "new" virus so that the next time it checks, it finds ''two'' problems. Then 4, then 8, 16, 32, and so on until the constantly running security checks and copious junk data hurt your PC's performance way more than the original malware did. Fortunately, it'd take forever to get bad enough to brick your PC, so it's pretty easy to catch and repair.
846* On Website/ThisVeryWiki, some tropes seem to contradict each other, such as for example ThereAreNoGirlsOnTheInternet and MostFanficWritersAreGirls, ItAlwaysRainsAtFunerals and ItsAlwaysSunnyAtFunerals, or TrailersAlwaysSpoil and NeverTrustATrailer. They do not form paradoxes, however, since tropes are not logical truths. Firstly, tropes generally hold for one or more works (here, "works" is meant in the widest sense possible, including memes and general attitudes: the first pair above are about memes) and it is understood that they do not necessarily hold anywhere else (which is why we have example lists): the trope [=XIsY=] is shorthand for "it can be non-trivially observed that in some works, X is Y". Next, even when looking into those works where the trope appears, it may reflect anything from fictional "fact" (X is indeed Y) to tendency, possibly in a twist (X tends to be Y, but ''whoops'') to belief (X is held to be Y, at least by one character). Sometimes we make an example of a work because it ''[[AvertedTrope averts]]'' X is Y, and that's still not paradoxical. In the RealLife sections of tropes, or in the case of a trope that deals with the real world (as the last pair in the list above does), contradictions could suggest a possible paradox, but most such can be explained by (acceptable) vagueness and bias.
847* DontShootTheMessage is a good example of a Logic Bomb that surfaces often in everyday life. {{Hypocrite}}s are said to be bad because they do not live up to the ideals they preach-- but they are more often condemned by their ideological opponents than by decent people on their side who you'd think would urge the hypocrites to StopBeingStereotypical. Ideological opponents, of course, criticize a given movement because they think it's inherently bad. But if you don't actually believe in an ideology you say you believe in, and that ideology is bad, then logically you must be good. But it's unethical to live a lie, so you must be bad-- even if what you're lying about is something that's bad in the first place, in which case undermining it is good, and so on ''ad infinitum''. Humorously summed up by Creator/OscarWilde in ''Theatre/TheImportanceOfBeingEarnest'', when he has the character of Cecily say: "I hope you have not been leading a double life, pretending to be wicked and being really good all the time. That would be hypocrisy."
* During the late Cold War, there was a recurring issue with group-based ICBM silos somehow ignoring their own safety protocols and attempting to launch by themselves. In one declassified instance, the computer went so far as to ignore commands to abort the launch. In desperation, one of the silo crew members set the missile to target its own silo. The computer tried to target itself but ran into a problem of ballistics: any trajectory long enough for the missile to come back down on its own launch point would also be so energetic that (at least in the computer's mind) the warhead would never hit the ground. The launch computers crashed and fed no data into the missile's flight computer, which then also crashed, causing the analogue systems to abort the launch.
849* The Ancient Greek philosopher Zeno of Elea proposed [[http://en.wikipedia.org/wiki/Zeno_paradox several famous logical paradoxes]] which completely baffled his contemporaries. They describe seemingly simple scenarios in ways that resulted in obvious contradictions. Some of these were resolved by his contemporaries, but the paradoxes of motion are still debated to this day. This was a very serious problem because Zeno's premises seemed undeniable and his arguments had proper logical form, yet the results were clearly false. To many, this suggested that logic itself was broken.
850** On the other hand, there is a school of thought which holds that Zeno never meant for the paradoxes to be taken as serious philosophical questions, but rather to illustrate a flaw in the Logical Method - i.e. that exercises in pure Logic do not always translate over into Reality - which they quite handily did.
851** A few refutations to the paradox of motion do exist:
852*** Aristotle's was quite simple. Zeno assumes that time and distance are infinitely divisible, but there is no evidence for such a claim. Without this assumption, the entire paradox collapses. [[OccamsRazor An unjustified claim that results in absurdity can be safely ignored.]]
*** Modern calculus attacks the problem from a different position, arguing that Zeno's logic really is flawed, as no rigorous method for handling infinite sums existed at the time. If time and space are infinitely divisible, then ordinary arithmetic is not the right tool to understand or describe them. Indeed, the paradoxes of motion pose questions that are basic exercises in introductory calculus (a worked version of the relevant sum follows this list).
854*** According to our current understanding of physics, the Planck Distance (about 10^-35 meters) is the shortest distance possible. If this is correct, then there aren't ''really'' an infinite number of stops involved in moving a certain distance (at least, not in the real world.)
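*** The worked sum behind the calculus point above (a standard geometric series, nothing specific to Zeno scholarship): the infinitely many ever-shorter legs add up to a finite total.
```latex
\sum_{n=1}^{\infty} \frac{1}{2^{n}}
  = \lim_{N \to \infty} \left( 1 - \frac{1}{2^{N}} \right)
  = 1
```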
* Many mathematical proofs, from the irrationality of the square root of 2 to Fermat's Last Theorem, rely on this: assume the opposite of the theorem to be proved, then derive a contradiction through mathematical induction or some other method.
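** A compressed sketch of the classic case (the usual textbook argument, paraphrased): assume the square root of 2 can be written as a fraction in lowest terms, then watch the assumption destroy itself.
```latex
\sqrt{2} = \frac{p}{q} \text{ (in lowest terms)}
  \;\Rightarrow\; p^{2} = 2q^{2}
  \;\Rightarrow\; p \text{ is even, say } p = 2k
  \;\Rightarrow\; 4k^{2} = 2q^{2}
  \;\Rightarrow\; q^{2} = 2k^{2}
  \;\Rightarrow\; q \text{ is even as well, contradicting ``lowest terms''}
```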
856* In the early days of computers playing chess (or otherwise engaged in evaluating strategic choices), the computer often crashed when there were multiple choices of which none offered an advantage over another-- so that it couldn't choose one.
857** Which itself is an adaptation of [[https://en.wikipedia.org/wiki/Buridan%27s_ass Buridan's ass]]: a donkey is placed equidistant to two equally sized, equally nutritious bags of feed; unable to choose between the two, the dictates of pure logic would lead it to starve to death exactly where it was.
* The famous "UsefulNotes/The47Ronin" situation was a massive LogicBomb by the standards of the Shogunate. The ronin had violated a direct order from the Shogun by avenging their master Asano through killing Kira, the man who had forced him into committing {{seppuku}}... yet some said that they had acted according to Bushido by avenging their master. Some believed that the truly honorable thing would have been to charge Kira's home immediately and get gallantly cut to pieces through honest action rather than [[ObfuscatingStupidity attaining vengeance through trickery]], especially given the major flaw in their actual plan: the potential for Kira to die of natural causes before their attack, which would have left them unable to avenge Asano. On the other hand, in a society where [[HonorBeforeReason honour was most definitely more important than life itself]], it could be argued that their actions were the ultimate demonstration of loyalty to their lord: by publicly dishonouring themselves to ensure their plan's success, they could be seen to be making the supreme sacrifice for Asano's sake. As such, the matter was resolved by allowing 46 of the ronin to commit {{seppuku}} instead of being dishonorably executed; the youngest of them was spared and became a monk.
859* A number of logic bombs have been found lurking in the realms of pure mathematics and logic with various consequences.
** Mathematicians had long assumed that continuous functions were "smooth" essentially everywhere. This idea had never been successfully proven, but it fit their intuition well. Karl Weierstrass upended that assumption by finding a function that is continuous everywhere but differentiable nowhere, a curve that is infinitely jagged. In doing so, he highlighted the need for rigorous definitions and frustrated the mathematical world; his counterexample was described as a "lamentable scourge", a "monster", and an "outrage against common sense".
861** Mathematics in the late 1800s saw the emergence of naive set theory. ("Naive" here refers to the fact that mathematicians generally manipulated sets according to poorly-defined intuitions about how a collection of objects should behave, rather than a clearly defined set of rules known as "axioms".) One of these intuitions was that for any property, a set of every object satisfying that property existed. Russell showed that this assumption would lead to a contradiction, through a very self-referential argument:
--> Let R be the set of all sets that do not contain themselves. R cannot be in R: if it were, R would contain a set (namely R itself) that contains itself, violating its own definition. But if R does not contain itself, then by definition it must be in R!
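::: In symbols (assuming unrestricted comprehension, the very rule the paradox kills):
```latex
R = \{\, x \mid x \notin x \,\}
  \quad\Longrightarrow\quad
  R \in R \iff R \notin R
```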
863** This proof was shocking to the mathematical community, and sparked a decades-long hunt for a system of axioms for sets that would be powerful enough to convey all of mathematics, but not so powerful that it would allow mathematicians to prove two contradictory statements, as Russell had.
864** Unfortunately, Kurt Gödel showed that such a system could not exist. His first Incompleteness Theorem (explained in the "Computing" section above) demonstrates that any useful mathematical system must contain statements that are true, but can never be shown as true within that system. ("Useful" here means that this system can model the counting numbers 0, 1, 2, ..., while also containing no contradictions.) These are perfect logic bombs, as a person or computer could try forever to answer them without success or even finding evidence that they will never succeed.
865* [[https://en.wikipedia.org/wiki/Liar_paradox The Liar's Paradox]]: "This statement is false." Which could be accepted as true... but then that confirms it's false... but then it's true... then it's false. The logic constantly loops, making the statement neither true nor untrue.
866* The similar "Epimenides Paradox". Epimenides, a Cretan, (apocryphally) stated that "All Cretans are liars". If this was true, then it would seemingly conflict with Epimenides' statement, but if it was a lie then it would seemingly confirm the statement, just contradicting it by being truth coming from a liar. Taken at face value, it appears to be the same as the Liar's Paradox, but in actuality, [[MindScrew if Epimenides is lying about all Cretans being liars]], then it's entirely possible for a Cretan (who needn't be Epimenides) to tell the truth. Or maybe he just meant that all Cretans ''usually'' lie.
867* Another classical paradox: An Athenian student hired a lawyer to teach him law, and they signed a contract that the student would pay if he won his first case, and would owe nothing if he lost it. He completed his studies and refused to pay. The lawyer promptly sued him, and argued that either the student would win, in which case he would owe the money, or he would lose, in which case judgment would be awarded against him and he would owe the money. The student rebutted by arguing the exact opposite: either he would win, and be absolved of paying, or he would lose, and therefore not owe the money.
868* A Swedish activist claimed that he wished to move to a country where no immigration was permitted. Just chew a little on that one...
* Asking some forms of mechanical calculator to divide by zero doesn't break them; rather, it causes them to loop forever, since mechanical division works by repeated subtraction and subtracting zero never finishes the job. The machine's parts will whirr around quite happily until you tell it to stop. You can see this in action [[https://www.youtube.com/watch?v=OFJUYFlSYsM here]].
870** [[MundaneUtility Hey, it's a great way to re-lubricate the machine.]]
871* Some [[{{Koan}} koans]] deliberately use this as a way towards enlightenment by momentarily shorting out rational thought when the listener tries to understand it.
872* Relatedly, Buddhism and Zen especially have a concept known as [[https://en.wikipedia.org/wiki/Mu_(negative) Mu]] which boils down to "the question itself is wrong".
** The physicist Wolfgang Pauli would later express a similar sentiment, describing one paper as "not even wrong" and telling Lev Landau (a great physicist in his own right) that "What you said was so confused that one could not tell whether it was nonsense or not."
874* Lal Bihari Mritak tried to do this to the legal system of his native India by kidnapping his cousin, picking fights with and insulting officials, and trying everything he could to get arrested. The reason he did this was that his uncle had bribed an official to get Lal Bihari declared LegallyDead (Mritak means "dead" in Hindi) to seize his land, and if the local officials arrested Lal Bihari, then they'd effectively admit that he was still alive. In fact, Bihari formed an entire group of people with similar issues, the [[https://en.wikipedia.org/wiki/Uttar_Pradesh_Association_of_Dead_People Uttar Pradesh Association of Dead People]].
875* [[https://en.wikipedia.org/wiki/Gabriel%27s_Horn Gabriel's Horn]] is a mathematical object that has a finite volume but an ''infinite'' surface area.
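** The standard computation (rotating ''y'' = 1/''x'' for ''x'' ≥ 1 about the ''x''-axis; this is the textbook version, nothing exotic): the volume integral converges while the surface-area integral does not.
```latex
V = \pi \int_{1}^{\infty} \frac{1}{x^{2}} \, dx = \pi
\qquad
A = 2\pi \int_{1}^{\infty} \frac{1}{x} \sqrt{1 + \frac{1}{x^{4}}} \, dx
  \;\ge\; 2\pi \int_{1}^{\infty} \frac{dx}{x} = \infty
```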
876* Some high school advanced mathematics textbooks contain problems where you are first presented with a formula which seems to prove 1=0 or something equally contradictory, and then asked to spot the flaw in the proof. The most frequent case is that there's a step where division by zero occurs, expressed as something not obvious like "x - y" where somewhere else in the proof it is established that x = y.
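** One common version of such a "proof" (any algebra textbook has a variant; the fatal step is dividing by ''a'' - ''b'', which is zero):
```latex
a = b
  \;\Rightarrow\; a^{2} = ab
  \;\Rightarrow\; a^{2} - b^{2} = ab - b^{2}
  \;\Rightarrow\; (a + b)(a - b) = b(a - b)
  \;\Rightarrow\; a + b = b
  \;\Rightarrow\; 2b = b
  \;\Rightarrow\; 2 = 1
```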
* [[https://en.wikipedia.org/wiki/Imaginary_number Imaginary (or complex) numbers.]] Using only "real" numbers, the square root of -1 should not exist, because multiplying any nonzero number by itself produces a positive number (1 × 1 and -1 × -1 both equal 1). At some point, someone decided to call this nonexistent number "i" and just move on from there. Though it seemingly makes no logical sense, and there's still no intuitive way to explain what "i" actually means, it's now used extensively in science, engineering, and the like.
** Complex numbers only looked nonsensical in the first centuries after the discovery of the formula for solving cubic equations, which in some cases produces them "temporarily" on the way to a perfectly real answer. As usual in mathematics, imaginary and complex numbers were eventually defined as perfectly regular entities: there is a [[https://en.wikipedia.org/wiki/Complex_number#Formal_construction rigorous construction]] of '''C''' as '''R'''×'''R''', the set of ordered pairs ''z=(x, y)'' standing for ''z=x+iy''.
** Incidentally, the rigorous construction of the '''real''' numbers, via [[https://en.wikipedia.org/wiki/Dedekind_cut Dedekind cuts]], is harder than the construction of the imaginary ones. Isn't that a paradox of its own?
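** The construction mentioned above, in brief: define addition and multiplication on ordered pairs of reals, and ''i'' is just the pair (0, 1).
```latex
(a, b) + (c, d) = (a + c,\; b + d)
\qquad
(a, b) \cdot (c, d) = (ac - bd,\; ad + bc)
\qquad
(0, 1) \cdot (0, 1) = (-1, 0)
```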
* Malicious compliance (see "BotheringByTheBook") is a direct invocation of this. The boss expects things to be done a certain way, and an employee performs it exactly that way, causing harm to the company. The boss can't punish the worker for doing as they were specifically ordered to, but Bad Things(tm) happened because of the worker's actions. This usually results in an ObviousRulePatch.
881* A nonsensical command in the InteractiveFiction game ''VideoGame/AIDungeon2'' will result in the AI going into an infinite loop of prompts, forcing the player to revert to an earlier state or restart the game.
* During a CNN debate exchange between UsefulNotes/BernieSanders and Elizabeth Warren in the 2020 presidential primaries, the moderators asked if Sanders had ever told Warren that a woman couldn't win the election, to which he answered no. They then asked Warren ''what she thought when Sanders told her a woman couldn't win the election.'' [[FlatWhat What?]]
** This is what is known as a "loaded question", a rhetorical tool in which the question itself contains a controversial assumption, usually a presumption of guilt that the interrogator is attempting to get the one being asked to confirm. Another example would be "Have you stopped beating your wife?"; no matter the answer, the question retains the assumption that the answerer is a perpetrator of DomesticAbuse. The Logic Bomb part comes in when the "assumed" part of the question has already been refuted ["Do you have a wife?" "No." "When did you stop beating your wife?"], which can leave the answerer confused or caught off guard. Of course, this can easily backfire, as the answerer may be quicker on the uptake ["I just told you, I don't have a wife."], which can cause the interrogator to be the one losing face by seeming inattentive, or by being caught trying to force a confession out of the answerer.
884* For a biological example, there is the "ant mill" or "ant death spiral" [[https://en.wikipedia.org/wiki/Ant_mill phenomenon]] seen in army ants. Army ants follow a rather simple set of behaviors, and primarily navigate by following scent trails left by other members of the swarm. A death spiral occurs when a group gets cut off from the main swarm and starts following their own scent trails. The scent trails go in a circle, causing the army ants to literally [[GoingInCircles run in circles]] until they drop dead from starvation or exhaustion.
885* This is averted by real-life "large language model" [=AIs=], such as GPT, which function by finding the most likely completion of a given text. No matter what convoluted paradox you describe to them, they'll never crash, but simply write more text that logically continues what you wrote -- for example, sending the message "This sentence is false" to the chatbot [=ChatGPT=] gives a response along the lines of "This statement is a classic example of a liar paradox; I can't determine if it's true or false. What else may I help you with?"
** [[https://learnprompting.org/docs/category/-prompt-hacking Prompt Hacking]] straddles the line between this trope and LoopholeAbuse, causing unintended behavior and possibly malfunctions in [=LLMs=] by structuring prompts so that statements and requests the LLM's owners have tried to moderate out slip past their moderation tools. Poorly protected [=LLMs=] can potentially even have malicious code or other harmful effects slipped into them, almost playing the trope straight.
** A straighter example is [[https://www.lesswrong.com/posts/kmWrwtGE9B9hpbgRT/a-search-for-more-chatgpt-gpt-3-5-gpt-4-unspeakable-glitch Glitch Tokens]], certain groups of characters that confuse [=LLMs=], leaving them unable to repeat the tokens properly or causing them to output gibberish. Every model has a different set of glitch tokens, depending on the tokenizer used to train it.
888[[/folder]]
889
