%% Image selected per Image Pickin' thread: http://tvtropes.org/pmwiki/posts.php?discussion=1305357805083384500
%% Please do not change or remove without starting a new thread.
%%
[[quoteright:350:[[VideoGame/{{Portal 2}} http://static.tvtropes.org/pmwiki/pub/images/know_your_paradoxes2_9503.jpg]]]]
[[caption-width-right:350:Yeah, good luck with #3.[[note]]This is a variation of [[http://en.wikipedia.org/wiki/Russell%27s_paradox Russell's paradox]], originally "does the set of all sets that are not members of themselves contain itself?" The change is not a mistake: to get around this, most formulations of set theory contain some variant of the [[http://en.wikipedia.org/wiki/Axiom_of_regularity axiom of regularity]], which essentially states that no set can contain itself, making "the set of all sets" an impossible construct, similar to the liar's paradox above, since it would necessarily contain itself.[[/note]]]]

->''"I cannot -- yet I must! How do you calculate that? At what point on the graph do 'must' and 'cannot' meet?"''
-->-- '''Ro-Man''', ''Film/RobotMonster''

Is your [[InstantAIJustAddWater sentient supercomputer]] [[AIIsACrapShoot acting up]]? Good news. There's an easy solution: confuse it.

If you give a computer nonsensical orders in the [[RealLife real world]] it will, generally, do nothing (or possibly appear to freeze as it loops eternally trying to find a solution to the unsolvable problem presented to it). In fiction-land, however, [[ExplosiveOverclocking it will explode]]. It may start stammering, "I must... but I can't... But I must..." beforehand. The easiest way to confuse it is with the Liar paradox, i.e. "this statement is a lie". A fictional computer will attempt to debate and solve the paradox until it melts down. If the computer is a robot, this will probably result in YourHeadASplode.

Paradoxes and contradictory statements (especially contradictory orders) have become the primary material used to build the Logic Bomb and thus the standard way to defeat any sophisticated, computerized system or AI. ''Be warned''; if the Logic Bomb [[TooDumbToLive fails to destroy]] [[{{VideoGame/Portal2}} the system outright]] (and in some cases, [[GoneHorriblyRight even when it does]]), the system's surviving remnants may go insane and attempt to kill you just the same.

Also note that RidiculouslyHumanRobots (and some very advanced [=AIs=]) are generally able to recognize and defuse logic bombs on sight, long before they go off (and may be GenreSavvy enough to view this as a particularly irritating kind of FantasticRacism). Some [[TooDumbToFool ridiculously dumb AIs]] are also immune to logic bombs by virtue of not ''understanding'' the concept of paradox - a sort of inverted case of AchievementsInIgnorance.

Occasionally the way to shut down such a computer is less like a few odd statements, and more like an advanced philosophical debate on the nature of truth, free will and purpose. The end result is still a super computer muttering an error several times before [[ExplosiveInstrumentation exploding]].

Of course, if writers had bothered to do their research, they would learn that computer software is often vulnerable to being fed inputs that cause buffer overflows or inject commands. Granted, these don't cause the machine to explode, but they instead place the computing device [[MindManipulation entirely under your control]]. Computers can also be bogged down or [[HeroicBSOD blue]][[http://en.wikipedia.org/wiki/Blue_Screen_of_Death screened]] with programs such as [[http://en.wikipedia.org/wiki/Fork_bomb fork bombs]] (each instance of the program opens two more). However, things like buffer or stack overflows are artifacts of our current underlying computer hardware architectures, and it's entirely plausible that such vulnerabilities won't exist in future computer systems. Overload attacks, though, are probably always going to be realistic.

If you want to do this to a well-organized group of people, use an AppleOfDiscord instead.

When the logical error outright retcons someone or something out of existence, that's a PuffOfLogic. A Logic Bomb that undoes reality itself is a RealityBreakingParadox. A TemporalParadox might be the cause.

For when the player does this to a video-game AI, see AIBreaker.

For the human equivalent, see some of the entries under BrownNote.

Not to be confused with LogicalFallacies (though some Logic Bombs use the fallacies listed on that page). For a similar mutually negating pair of principles, see Catch22Dilemma.

See also ReadingsBlewUpTheScale and ExplosiveInstrumentation.

----
!!Examples:

[[foldercontrol]]

[[folder:Anime & Manga]]
* ''Anime/GhostInTheShellStandAloneComplex'': The Tachikomas ([=AIs=] themselves) temporarily confuse a lesser AI with a variation of the [[http://en.wikipedia.org/wiki/Epimenides_paradox Epimenides paradox]], and comment on how they are advanced enough to know there is no answer.
-->'''Tachikoma''': Folks that can't handle a self-reference paradox are real suckers.
** Incidentally, this segues from a conversation about the UncannyValley, discussing how, because the Tachikomas are obviously non-human, humans feel more comfortable giving them more advanced intelligence, since they seem less threatening. Conversely, the more human-looking speaker robots have less advanced A.I. which cannot resolve Logic Bombs, because an A.I. that looks human and can potentially outsmart humans makes people uncomfortable.
* Quasi-example in ''Anime/CodeGeass'': Rolo the {{Tykebomb}}, who might as well be nonhuman, suffers one of these when Lelouch's manipulations clash with the need to nab his mission objective, C.C., who is right in front of him. A non-fatal example, but the mental standstill is fun to watch.
* Done hilariously in ''{{Gundam 0083}}''. Kou is so nervous that when Gato berates him on the battlefield about not acting like a grunt and seeing the big picture (meant as an insult), Kou actually takes the comment as sincere advice and tells Gato "Y-yes sir." This is immediately followed by Gato's brain breaking at the sheer illogic of having to remind Kou that he's ''the enemy'', idiot!
** The look on his face when it happens makes it that much ''funnier''.
* ''Anime/MobileSuitGundamUnicorn'' has this as the final straw that breaks Marida Cruz's mental conditioning: [[spoiler:as one of Elpeo Ple's clones, she's been heavily conditioned to see any and all Gundams as the enemy. When it's pointed out that she is currently flying a Gundam herself, her mind breaks at the resulting contradiction, allowing her true self to emerge again.]]
* A metaphysical example in ''VisualNovel/UminekoNoNakuKoroNi'': To defeat an enemy witch, Beatrice the witch [[spoiler:apparently]] suicidally denies the existence of witches in the LanguageOfTruth. The result looked like a detonating Logic Bomb.
** Played straight in [=EP6=] where [[spoiler:Battler attempts to explain how all the murders were done and he was the culprit, and ends up trapping himself in one of his own closed rooms. Erika and Bern then take advantage of this situation.]] Another example more fitting to the above picture, [[spoiler:Battler and a revived Beatrice prove that there are only 17 people on the island and demand that Erika, the 18th person, explain her existence. She can't, so she dies/gets whisked away to the worst fragment by Bern.]]
* In ''[[Manga/LostBrain Lost+Brain]]'', super-hypnotist [[UtopiaJustifiesTheMeans Hiyama]] accidentally creates one when he simultaneously "programs" his victims with both a strong will without weaknesses and absolute submission to him (a la [[Film/MontyPythonsLifeOfBrian "Yes! We are all individuals!"]]). The "bomb" goes off when Hiyama orders his thralls to kill heroic hypnotist Kounji, and one of them also happens to be holding a [[StuffBlowingUp detonator]]...
* In ''[[LightNovel/HaruhiSuzumiya The Disappearance of Haruhi Suzumiya]]'', [[spoiler:Yuki Nagato decides to [[AlternateUniverse reset herself (and the rest of the universe)]] because she cannot [[ButterflyOfDoom accurately simulate what she's going to do after she learns the result of said simulation]], given that her simulation is constructed from information based on the result of the simulation.]]
* In ''Manga/{{Grey}}'', the protagonist defeats the MasterComputer Toy, which believes itself to be a god and wants to exterminate all of humanity, by asking it how it can be worshiped if there is nobody left to believe in it. This stuns the AI just long enough for Grey to deal the final blow.
[[/folder]]

[[folder:Art]]
* "[[http://upload.wikimedia.org/wikipedia/en/b/b9/MagrittePipe.jpg La trahison des images]]", a painting by the Belgian artist René Magritte, says "This is not a pipe" underneath in French. It's a ''picture'' of a pipe. Actually, it's just paint on canvas that we recognise as a pipe. Well, usually it's ink on paper arranged to resemble the paint on a canvas that we recognize as a pipe. Unless you're looking at it now, in which case it's RGB pixels on a screen that look like the ink etc. etc.
[[/folder]]

[[folder:Audio Play]]
* Parodied in Creator/TheFiresignTheatre's ''AudioPlay/IThinkWereAllBozosOnThisBus'' when the Clem-clone gives Dr. Memory logic indigestion by asking "why does the porridge bird lay her eggs in the air?"
[[/folder]]

[[folder:Comic Books]]
* In ''ComicBook/{{Runaways}}'' v2 #23, Chase casually asks the cyborg Victor whether God could make a sandwich so big He couldn't finish it, causing Victor to stammer, emit a series of ones and zeroes, go ExplosiveInstrumentation, and pass out. Chase explains that Victor was programmed to be both super-logical and super-spiritual; there are three questions that will cause him to short out, but each will only work once. Each question also has a counter-answer, in case he needs to be revived. The answer to this question is, "Yes, and then He'd finish it anyway."
* ''ComicBook/MarvelAdventures ComicBook/FantasticFour'', basically Fantastic Four stories for younger readers, has Mr. Fantastic do this when challenged to defeat the "ultimate alien supercomputer". When the computer, which supposedly is nigh-omniscient, says there is nothing it can't do, Mr. Fantastic tells it to create a rock so big it can't pick it up. The computer is sparking metal in ten seconds.
* Parodied in ''TopTen'':
-->'''Irma Geddon''': You know, you [=AI=]s are almost too cute. How do I unplug you when you take over the world?
-->'''Joe Pi''': Ask me the purpose of existence, and I explode.
* ''FoxTrot'': Jason once asked his mother if Marcus could sleep over. She said it was all right with her if it was all right with his father. When Jason asked his father, he was told it was fine with him if it was all right with his mother. After the BeatPanel, Jason is shown consulting several logic books.
** An earlier strip featured Paige entering the same situation and just telling her friend "yeah, it's all right."
* The long-running Brazilian comic series Pt/TurmaDaMonica liked to use this now and then, often resulting in a CrowningMomentOfFunny. One recent, but nonetheless hilarious use of this happened when the gang was confronted by an {{Expy}} of none other than [[Videogame/FinalFantasyVII Sephiroth]]. He appears, [[ArrogantKungFuGuy saying how ridiculously powerful he is, flying incredibly high]] until one character asks "If you only got a single wing, how come you can fly just fine?". Cue OhCrap and the {{Expy}} [[GravityIsAHarshMistress falling to his doom]].
* ''SonicTheComic'' featured Predicto, a robot which could predict Sonic's movements thanks to its encyclopaedic knowledge of his personality and tactics. This was effective -- until [[{{Determinator}} Sonic]] ''surrendered''.
* In ''PrinceValiant'' the prince and his adventuring crew become prisoners on an island with an [[AGodAmI all-knowing oracle]]. The only way off is to ask a question the oracle doesn't know the answer to. After many days of endless questioning the prince finally comes up with the answer: "Why?"
* In ''TheAuthority'', The Midnighter normally begins a fight by simulating it over and over on the supercomputer in his head until he knows everything his opponent might do. An attempt to use this on SelfDemonstrating/TheJoker, however, resulted in the Midnighter just [[ConfusionFu standing there and staring blankly]].
* In ''Comicstrip/{{Peanuts}}'', Linus subjects himself to a self-inflicted Logic Bomb with his belief that the Great Pumpkin always rises from the most sincere pumpkin patch on Halloween night. The moment he thinks to question whether his patch is sincere ''enough'', he's blown it: if he tries to change anything to make it more sincere, he'll only be expressing his own doubts and reducing the sincerity of his faith in the Great Pumpkin.
-->'''Linus:''' ''[to the other kids, [[ScrewThisImOuttaHere who are leaving]]''] I'll put in a good word for you if he comes...[[OhCrap Oh, no!]]...I mean, ''when'' he comes!
* ''SquadronSupreme'' has supervillains brainwashed to work for the titular Squadron, with the mental directive implanted into their minds that they shall not betray any of their members. What happens when one of them witnesses a member of the Squadron working against the others? The mind gets locked into a loop, since revealing the information means betraying one member, while keeping it secret means betraying everyone else in the Squadron.
* Lampshaded and defied by a legion of robots fighting with PowerGirl in a BatmanColdOpen.
-->'''Unimate''': Unimate has come to cleanse the Earth of the imperfect organic matter known as Kryptonian. Kryptonian is imperf--\\
'''Power Girl''': No! ''You'' are imperfect! You must cleanse the Earth of your''selves''!\\
'''Unimate''': Failure-- Unimate is programmed to reject stratagems from old "''Franchise/StarTrek''" episodes.\\
'''Power Girl''': Aw, nuts. Worth a try, anyway.
* 1980's British science fiction comic ''Starblazer'', issue 153 "The Star Destroyers". The Vonan {{AI}} known as the Magister believes itself to be all-powerful. It is defeated by Galactic Patrol agent Al Tafer when he tells it it isn't all powerful because it can't destroy itself. This drives the Magister crazy and causes it to blow up the Vonan system's sun, destroying itself and the Vonans as well.
* This is how SelfDemonstrating/TheJoker is defeated in the story ''ComicBook/EmperorJoker''. The Joker, who had managed to steal Mr. Mxyzptlk's magic and was about to [[RealityWarper rewrite the entire universe]], was facing a dying Franchise/{{Superman}}, who asked him why he gave so much importance to Batman. When the Joker found himself unable to erase Batman from existence, our hero asked how he could be all-powerful if he couldn't even wipe out one man's existence. A split second later, Superman was able to take back his heart, and Mr. Mxyzptlk recovered his magic to bring everything back to normal.
* Sebastian Shaw once outfitted a series of Sentinels with one of these, so that he could use it to destroy them if they were ever turned against him. The logic: since these Sentinels are derived from the original Mark I's, they have evolved and grown stronger, and thus they are Mutants; since they are Mutants, they must be destroyed. Sadly, it doesn't work out that way, as Loki fuses three of them into the deadly Tri-Sentinel, and when Shaw triggers the failsafe, it only confuses them long enough for ComicBook/SpiderMan to use his recently gained Captain Universe powers to take them out.
* ''ThePowerpuffGirls'' #40 (DC Comics run), "Everything You Know About The Powerpuff Girls Is Wrong," has the students of Pokey Oaks Kindergarten coming up with their own versions of how the girls were created, paralleling the origins of Superman, Spider-Man and the Fantastic Four. When the girls intercede and tell how they were ''really'' created, their teacher Ms. Keane gives them failing grades because the students were tasked with trying to be creative in their tales.
[[/folder]]

[[folder:Computing]]
* This trope is actually misleadingly named. [[http://en.wikipedia.org/wiki/Logic_bomb Logic bombs]] refer to deliberate code hidden in a program which, when triggered, negatively affects its functionality -- a simple 30-day trial expiring and disabling itself is a low-key example. TheOtherWiki has bigger and nastier ones.
* Mainstream operating systems are vulnerable to a simple one: the fork bomb. It consists of giving the computer an order that creates copies of itself, and the copies create copies too, until the computer is too busy copying instructions and running them to do anything else. (A minimal sketch appears after the sub-entries below.)
** There's also a variant that just allocates a lot of memory. With enough privileges, the computer gets too busy swapping in and out the programs running to do anything else.
*** And if you're running a program that allocates memory dynamically, failing to free that memory creates leaks. Not such a problem on modern [=OSes=], which reclaim a process's memory when it exits, but on embedded systems you'd better watch yourself.
** "Grey goo" attacks, similar to the "fork bomb", have also been used successfully -- at least twice -- in ''SecondLife'', by users creating objects which (self-)replicated at a rapid rate, eventually causing the servers to be too busy processing the grey goo to do anything else.
*** A mile-high Jenga tower will also crash ''Second Life'''s servers quite effectively: [[WreakingHavok pull out a key block]], and they'll crash trying to calculate the exact trajectory of each of the thousands of falling blocks.
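For the curious, the basic fork bomb really does fit in a couple of lines. Here's a minimal sketch in Python (Unix-only, since it relies on ''os.fork'', and emphatically not something to run outside a disposable, resource-limited virtual machine):

```python
import os

# Classic fork bomb: every process forks a child, then both parent and
# child loop back around and fork again, so the process count doubles
# on each pass until the machine runs out of process slots or memory.
# Do NOT run this on a machine you care about.
while True:
    os.fork()
```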
* There's also the concept of a "deadlock", a "chicken or the egg" paradox in which two or more programs or events each require the other to finish before they can finish themselves. Several computers have turned themselves into lifeless (until restarted) lumps of silicon as a result. (A minimal two-thread sketch appears after the next entry.)
** In a similar fashion, there is such a thing as "livelock", where a system can get stuck doing "work" without ever making progress. As Wiki/{{Wikipedia}} puts it, people weaving back and forth in a corridor, trying to let each other pass, is an example of livelock. Sometimes the solution is the same: stop, wait randomly, try again.
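The classic deadlock itself can be sketched in a dozen lines of Python: two threads each grab one of a pair of locks and then wait forever for the one the other thread is holding. (The names here are illustrative, not from any particular codebase.)

```python
import threading
import time

lock_a = threading.Lock()
lock_b = threading.Lock()

def worker(first, second, name):
    with first:                      # grab one lock...
        time.sleep(0.1)              # ...give the other thread time to grab its own...
        print(name, "waiting for its second lock")
        with second:                 # ...then wait forever for the lock the other thread holds
            print(name, "finished")  # never reached

# Thread 1 takes A then wants B; thread 2 takes B then wants A.
t1 = threading.Thread(target=worker, args=(lock_a, lock_b, "thread-1"))
t2 = threading.Thread(target=worker, args=(lock_b, lock_a, "thread-2"))
t1.start()
t2.start()
```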
* In general, it's impossible to tell in advance whether a program will loop forever or stop after some time. Most operating systems work around runaway programs by simply not allocating all the processing power to a single program, but older ones did not, with the implied reasoning that the programmer will know what he's doing.
** This is called the halting problem. TheOtherWiki [[http://en.wikipedia.org/wiki/Halting_Problem talks about it.]] In summary, it's been proven that no computer (even a MagicalComputer) can predict with perfect accuracy whether any given program will halt. (A short sketch of the argument appears after the entries below.)
*** Technically, the Halting Problem proves that no ''Turing Machine'' (TM) can predict whether any given program it runs will halt. [=TMs=] are like the computers you are using to read TV Tropes, except that [=TMs=] have infinite memory. Given that your computer has a finite amount of RAM, it's therefore not a true TM, but rather a finite approximation of one. (But the point still stands: your computer can't solve the Halting Problem. Neither can a supercomputer.)
*** It gets more fun: if the program ''will'' halt, a TM simulating it will eventually find out... at some point. If it ''won't'', the TM may never be able to tell you.
** [[http://www.lel.ed.ac.uk/~gpullum/loopsnoop.html An elegant proof that the Halting Problem isn't solvable, which can be enjoyed by non-mathematicians.]]
** In a somewhat similar vein, there is the problem of proving the 'correctness' of a given program. This is so difficult that at least one person gave up on it and went off to work on what we now call public key cryptography, because that problem (which some were certain was impossible) appeared more tractable.
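Getting back to the Halting Problem: the diagonal argument at its heart can be sketched in a few lines of Python. This is purely illustrative -- the ''halts'' oracle stubbed out below is exactly the thing being shown impossible:

```python
# Suppose, for contradiction, that a perfect oracle halts(f) existed,
# answering True if calling f() would eventually return and False if
# it would run forever.  No such function can exist; the stub below
# only marks where it would go.
def halts(f):
    raise NotImplementedError("no such oracle can exist")

def troublemaker():
    # Do the opposite of whatever the oracle predicts about ourselves.
    if halts(troublemaker):
        while True:      # oracle said "halts", so loop forever
            pass
    return               # oracle said "loops forever", so halt at once

# Whatever halts(troublemaker) answers, troublemaker() does the opposite,
# so the oracle is wrong about at least one program -- contradiction.
```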
* The "classic" Mac OS dedicated an entire "DS" (fatal) error ID (ID=04) to catching and handling the so-called 'Zero Divide Error'; as the Mac Secrets books put it, "When programmers test their works in progress, they might deliberately instruct the computer to divide a number by zero, to see how well the program handles errors. They occasionally forget to take this instruction out, as you've just discovered."
** The list of classic Mac OS numbered error messages runs from -261 to 33; the ''Dire Straits'' errors (originally called the ''Deep Shit'' errors before someone in Apple got nervous) are all positive numbers, and if you get one, you are heavily advised to restart the system, since, even if it somehow managed to avoid a full crash, it's critically unstable. The DS errors run a wide range of reasons, from simple "something critical ran out of memory" (DS Errors 01, 02, 25, and 28) and "instruction not understood" (DS Errors 03, 12 and 13), to such doozies as "the Mac's processor switched into debug mode" (DS Error 08) and the aforementioned "Zero Divide Error".
* Older calculators (and the calculator programs in certain operating systems) were notorious for attempting to generate an infinite number of digits when asked to divide by zero, effectively crashing the machine.
** Even the most high-tech computers can ''still'' do it. Ever wondered what Runtime Error 200 means? It's the divide-by-zero error of Pascal programs.
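In most modern languages, nothing tries to spool out infinite digits; the division simply raises an error, and if nothing catches it, the program dies on the spot. A trivial Python illustration:

```python
def average(values):
    # Forgetting to guard the empty case is the classic slip-up:
    # len(values) is 0 and the division blows up immediately.
    return sum(values) / len(values)

print(average([3, 4, 5]))   # 4.0
print(average([]))          # raises ZeroDivisionError: division by zero
```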
* Kurt Gödel is famous for managing to drop a logic bomb on all of mathematics by proving that no sufficiently complex, non-trivial mathematical system can be both complete and consistent. The logic of it (ha ha) is argued as follows (a compact formal statement appears after the sub-entries below):
## Someone introduces [[EvilGenius Gödel]] to a UTM, a machine that is supposed to be a [[MagicalComputer Universal Truth Machine]], capable of correctly answering any question at all.
## Gödel asks for the program and the circuit design of the UTM. The program may be complicated, but it can only be finitely long. Call the program P(UTM) for Program of the Universal Truth Machine.
## Smiling a little, Gödel writes out the following sentence: "The machine constructed on the basis of the program P(UTM) will never say that this sentence is true." Call this sentence G for Gödel. Note that G is equivalent to: "UTM will never say G is true."
## Now Gödel [[EvilLaugh laughs his high laugh]] and asks UTM whether G is true or not.
## If UTM says G is true, then "UTM will never say G is true" is false. If "UTM will never say G is true" is false, then G is false (since G = "UTM will never say G is true"). So if UTM says G is true, then G is in fact false, and UTM has made a false statement. So UTM will never say that G is true, since UTM makes only true statements.
## We have established that UTM [[CannotSpitItOut will never say G is true.]] So "UTM will never say G is true" is in fact a true statement. So G is true (since G = "UTM will never say G is true").
## [[SugarWiki/MomentOfAwesome "I know a truth that UTM can never utter,"]] Gödel says. "I know that G is true. [[NotSoInvincibleAfterAll UTM is not truly universal."]]
*** The way this applies to math, not just computers, is basically that any sufficiently complex math system can be assigned a translation to/from a UTM - so when the UTM gets stuck on "G is true", the math system also gets stuck on the translation.
**** Likewise, it can be applied to any system of philosophy and morality; "A machine built on Hegelian principles will never say this sentence is true".
*** In non Math Circles, this story is usually followed with the UTM replying "Gödel, you're a dick" and then punching him, despite not having any arms. Frequently something similar happens to the Math Student who told the story.
*** And this proves that people can't be perfect logicians either, as every person has their own sentence G -- "[Your name here] will never say that this sentence is true". Or [[MindScrew worse]], "[Your name here] will never ''believe'' that this sentence is true".
*** Note that in the same way the sentence G refers to the machine indirectly via its program, the sentence G also does not really contain the phrase "this sentence", instead using some more mathematical form of self-reference such as a quine.
** The book ''[[GodelEscherBach Gödel, Escher, Bach]]'' plays with this in one of the interludes. The Crab wants to own the perfect record player, but the Tortoise constantly devises records that cause Logic Bomb effects on the Crab's record players (using loud resonant frequencies that destroy the players if reproduced 100% accurately). The Crab then buys a reassembling record player that changes its structure to accommodate the record being played. This works at first but the Tortoise then makes a record that targets the module that effects the restructuring, that being the one component the record player cannot change. Eventually the Crab buys a record player that recognises and simply refuses to play the Tortoise's records. (Sadly according to Henkel's Theorem the Tortoise will eventually figure out how to bypass that too.)
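Getting back to Gödel himself: for readers who want the formal shape of the result, the diagonal lemma says that any consistent, sufficiently strong system S contains a sentence G that (provably within S) asserts its own unprovability. In the usual notation, with Prov standing for S's provability predicate and the corner quotes for G's Gödel number:

```latex
% Diagonal lemma: S proves that G is equivalent to "G is not provable in S"
G \;\leftrightarrow\; \neg\,\operatorname{Prov}_S\!\left(\ulcorner G \urcorner\right)
```

If S proved G it would be proving a falsehood, and if it proved the negation of G it would be inconsistent; so G is true but unprovable -- the same bind the poor UTM lands in above.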
* There is something called a [[http://en.wikipedia.org/wiki/Killer_poke killer poke]] that actually does physically damage the computer. A simple example would be overclocking the CPU so much that it overheats to the point where it melts.
** There ''was'' something like that. Now, if a CPU heats up too much it hangs before any physical damage is done, and a simple reboot fixes it. All the killer pokes mentioned on the Other Wiki were for old computers, and mainly depended on toggling a relay of one sort or another until it died. Modern computers don't have relays anymore.
** The original 'killer poke' was caused by a hardware change in some models of the venerable Commodore PET. The original hardware had text output sped up by setting a particular memory-mapped hardware register to a particular value, but later hardware versions would destroy the built-in monitor if the same change was made. The instruction in the built-in [=BASIC=] interpreter to set memory values was 'poke', hence the name.
** Increasing the voltage still works, and memory and chipsets can often be easier to damage intentionally, although harder to do inadvertently.
** Literal hardware damage is actually possible when an FPGA (configurable logic circuit) is used for some calculations instead of a CPU. If the chip activates many concurrent outputs on a single bus, it may theoretically overheat or crash. If some infinite loop occurs, the entire circuit will start to oscillate at the maximum possible frequency, putting everything into an undefined state and consuming lots of power. Of course, FPGA compilers try to protect against such situations.
** A related concept is the [[http://en.wikipedia.org/wiki/Halt_and_Catch_Fire Halt and Catch Fire]] (HCF) instruction. Originally, this was a jump-to-self instruction (used as a HALT) in the IBM System 360 Mainframe (which used magnetic core memory, typical of systems of the era). Because of how core memory works, this resulted in the same access wires (core memory cannot be built with printed circuits) being used very frequently and overheating, eventually smoking. Some microprocessors (such as the Motorola 6800) have undocumented opcodes that cause the processor to do strange and generally non-useful things which are sometimes referred to as HCF instructions outside the design team. The programmer's manual for the MIPS-X processor refers to a Halt and Spontaneously Combust (HSC) instruction in the variant built for the NSA, but this is merely a joke. Some versions of the Zilog Z80 processor are rumored to have had an undocumented opcode that could actually burn out the processor.
* In designing digital logic circuits, an example of a logic bomb would involve connecting the outputs of two logic gates. As long as your gates agree on an answer, everything is fine, but if they disagree, you end up with a short circuit which will almost certainly cause the circuit to die a horrible flaming death due to the loss of the black smoke (i.e. this could be used to make a literal logic bomb).
** Good designs will never actually have this problem because they never connect two logic outputs without additional protection circuitry.
** Most logic chips have some protection built in. It's not good for them, but they will usually survive. The big problem is that the result becomes undefined as the two chips 'fight' each other. It also wastes a lot of power and can cause overheating.
* An arcane example known as ''fatal thrashing'' occurred on the IBM System/370 series mainframe computer, in which a particular instruction could consist of an execute instruction, which crossed a memory page boundary, which in turn pointed to a move instruction, that itself also crossed a memory page boundary, targeting a move of data from a source that crosses a third memory page boundary, to a target of data that again crossed a memory page boundary. The total number of pages thus being used by this particular instruction is eight, and all eight pages must be present in memory at the same time. If the operating system allocates less than eight pages of actual memory in this example, when it attempts to swap out some part of the instruction or data to bring in the remainder, the instruction will again page fault, and it will thrash on every attempt to restart the failing instruction.
** The minimal hardware configuration of the System/370 didn't have 8 pages of memory available for user mode programs, making this a major problem on these systems.
* The Year 2000 Problem. Computer systems that represented years with only two digits (while assuming the first two digits were 19) would be unable to distinguish the year 2000 from the year 1900, thus throwing off date/time calculations. Fortunately, since computer people saw this coming well before it hit, most of the truly important systems were redone with better date representations well before any problems manifested. Wikipedia's page can be found [[http://en.wikipedia.org/wiki/Year_2000_problem here]].
** Computer systems software that use "Unix time"[[note]]i.e. use time handling routines based on those that originated in the C programming language during the time Unix was developed -- and the legacy of the C programming language is ''widespread''[[/note]] such as (obviously) the Unix variants and Linux, but also many subsystems running on, say, Microsoft Windows, have a similar problem. Unix time typically uses a signed[[note]]meaning capable of having both negative and positive values[[/note]] 32-bit counter based on the number of seconds since January 1, 1970. However, this is doomed to roll over (i.e. go from most positive to most negative) sometime in 2038, creating a "Y2.038K" problem for Unix-based systems.[[note]]one suggested fix which is already implemented on some systems is to redefine the counter to 64-bit, which postpones the problem for about 293 billion years[[/note]]\\
\\
Exactly what Y2.038K might lead to is difficult to predict, since applications handle negative time values differently. Some applications will deal with the most negative value as a date-time in the past (specifically on 13 December, 1901) and may keep working (and producing garbage) until someone notices. Other applications are programmed to reject any negative time values[[note]]or even values less than at some defined point in time such as when the application was developed[[/note]] and will raise some kind of error, which in turn may have all kinds of effects from alerting the operators while preserving data to crashing the system. Needless to say, having the system treat the value as invalid (which is arguably easier with Unix time-based systems than it was with [=Y2K=]-susceptible systems) is preferable to tolerating it: silent wrong answers are much worse than crashes (talk to any medical equipment, process control, or financial computer vendor). A small sketch of the 2038 rollover itself appears after the entries below.
** To prove that the [=Y2K=] problem was not just hype, there was an example of a real data output error. For technical reasons, one state required that a certain class of commercial vehicle have title documents created several months before it was actually built. The result was that the title documents were created in the summer of 1999 for vehicles to be built in early 2000. This state also had a special designation for any vehicle built before an arbitrary year early in the 1900's. Because the computers had not yet been brought up to [=Y2K=] compliance, they actually did think the year for the vehicle manufacture was 1900, and duly put the special designation on the title documents.
** While many people were concerned about banks and other financial institutions having problems with [=Y2K=], this was unlikely for a simple reason: Such institutions would have become aware of the problem in 1970 or 1975, when programs generating documents for 25- or 30-year financial instruments started giving garbage answers for 2000. Since most banks take a long view of things (or at least have divisions that do so), such problems would have been dealt with far in advance.
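To see both the 2038 ceiling and the wrap-around for yourself, here's a small Python sketch (the 1901 date at the end assumes a platform that accepts pre-epoch timestamps, as Unix-like systems generally do):

```python
import struct
from datetime import datetime, timezone

LAST_OK = 2**31 - 1    # largest second count a signed 32-bit field can hold
print(datetime.fromtimestamp(LAST_OK, timezone.utc))   # 2038-01-19 03:14:07+00:00

# One second later, the value no longer fits in 32 signed bits...
try:
    struct.pack("<i", LAST_OK + 1)
except struct.error as err:
    print("overflow:", err)

# ...and code that silently wraps instead of checking lands back in 1901.
wrapped = struct.unpack("<i", struct.pack("<I", (LAST_OK + 1) & 0xFFFFFFFF))[0]
print(datetime.fromtimestamp(wrapped, timezone.utc))    # 1901-12-13 20:45:52+00:00
```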
* The race condition can be seen as a logic bomb, ranging from something minor to something very drastic. It involves one piece of data that two components can read from or write to. A minor case of a race condition is your display: say the graphics card starts rendering frames faster than the display can put them out. While the display is reading the frame buffer, the graphics card suddenly copies a new frame into the buffer, and the display spends one frame showing parts of two images at once (a phenomenon also known as tearing). Far more serious instances were two incidents involving [[http://en.wikipedia.org/wiki/Therac-25 Therac-25]], a radiation therapy machine.
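A minimal Python sketch of a race on a shared counter -- the read, modify and write are separate steps, so two threads can read the same old value and one increment silently vanishes (how many vanish depends on the scheduler):

```python
import threading

counter = 0

def bump(times):
    global counter
    for _ in range(times):
        value = counter      # read...
        value += 1           # ...modify...
        counter = value      # ...write: another thread may have updated
                             # counter in between, and that update is lost

threads = [threading.Thread(target=bump, args=(100_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(counter)   # typically well short of the expected 400000
```
[[/folder]]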

[[folder:Fanworks]]
* A silly one occurred in ''WebVideo/YuGiOhTheAbridgedSeries'': Duke managed to get Nesbitt to self-destruct by showing him a picture of [[Anime/YuGiOhZexal Yuuma]], which was too illogical for Nesbitt's robotic brain to handle. This was after Serenity tried "Which came first, the chicken or the egg?" and he [[TakeAThirdOption chose]] "[[RocketPunch The rocket-powered fist!]]"
-->"But that wasn't one of the options - GAAAH! I stand corrected."
* An interesting version happens to a ''person'' in the ''Fanfic/PonyPOVSeries'' [[BadFuture Dark World]] Arc. [[spoiler:The Apple/Pie Family's refusal to be crushed by tragedy in the least and Apple Pie managing to laugh ''despite'' tragedy completely baffles [[TheDragon Twilight Tragedy]] to the point it's one of the factors responsible for triggering her VillainousBSOD and subsequent HeelFaceTurn.]]
** Later, Apple Pie does this to a zombie army by pointing out how they can't be alive and dead at the same time, and should just lie back down and die. ''It works.''
** According to Rancor, this is Apple Pie's ''explicit power'' as the Element of Laughter (at least how she represents it). [[spoiler:While Rancor isn't affected, Angry Pie is, and is barely able to will herself not to think about it.]]
** Later, she does this to a [[NinjaPirateRobotZombie giant-cyborg-spider-vampire]] by pointing out that it's programmed to protect chaos and destroy harmony, and half the group hold an Element of Harmony AND an Element of Chaos, so it can't protect and destroy them at the same time. After this causes it to [[MadeOfExplodium violently explode]], Rarity and Twilight lampshade it.
--> '''Rarity''': Why do paradoxes always make robots explode? Shutting them down I understand but exploding?
--> '''Twilight''': Knowing Discord, he probably designed it that way.
* The ''VideoGame/TalesOfTheAbyss'' fanfic ''Phasis'' has a humorous example performed on a human:
--> "Well, I'm not worried about Luke," [[DeadpanSnarker Jade]] smiled. "Luke wouldn't die if his life depended on it."
-->"I would, too!" [[IdiotHero Luke]] cried indignantly. Then he stopped. "...would I?" he asked with a glance at Tear. [[{{Facepalm}} She just held a hand against her forehead, shaking it slowly.]]
* ''Plan 7 of 9 from Outer Space''. [[StarTrekVoyager Captain Proton]] ends a cyborg revolution with this trope (called "The Kirk Manoeuvre") by pointing out to their "[[Film/StarTrekFirstContact borg queen]]" that {{Hive Mind}}s (by definition) don't have leaders. When he tries the old "the next thing I say is a lie" trick on a KillerRobot however, it smacks him in the chops.
-->"I'M NOT FALLING FOR THAT ONE! AND DON'T ASK ME TO CALCULATE THE VALUE OF PI EITHER, BECAUSE I'M NO GOOD AT MATH!"
[[/folder]]

[[folder:Film]]
* In ''WarGames'', a logic bomb-like device was used to teach the NORAD computer Joshua the futility of nuclear war: play tic-tac-toe with yourself until you win. After exhausting all possible move combinations it makes the logical leap and begins plotting out every conceivable nuclear strategy, ending in some ExplosiveInstrumentation, after which the computer concludes "The only winning move is not to play."
** SMBC Theater had a different [[http://www.youtube.com/watch?v=TFCOapq3uYY&feature=player_embedded winning move]].
* In the German version of ''Film/DrNo'': The BondOneLiner (after the mooks in the ''hearse'' crashed down the cliffs) was slightly altered from its English original version. Into a logic bomb.
--> ''"What happened there?'''
--> ''"They were in a hurry to attend their own funeral in time."''
* The [[AIIsACrapshoot HAL 9000 computer]] in ''TwoThousandOneASpaceOdyssey'' became murderous because it was told to keep its crew from finding out the secret details of their mission until they got to Jupiter, even though it had also been programmed to not withhold or distort information. It's a riddle with a [[MurderIsTheBestSolution simple solution]]: break contact with Earth and kill the crew, so there's nobody to hide the secret from.
** In the novel, the narrative muses that HAL might have been able to find a peaceful solution to the problem, had mission control not requested his temporary disconnection. HAL, being unable to grasp the concept of sleep, was convinced that the disconnection would mean the end of his existence, and his killing spree was therefore, all in all, a misguided attempt at self-defense.
* {{Master Computer}}s of 70s sci-fi were particularly poor at handling illogical input. The central control units in both ''Rollerball'' and ''Film/LogansRun'' were sent into confused, ExplosiveInstrumentation paroxysms by sheer accident.
** The computer in ''Film/{{Rollerball}}'' has clearly been programmed to withhold information, and it's actually the ''programmer'' who has a breakdown when it refuses to divulge information on the Corporate Wars. The computer in ''Logan's Run'', however, is convinced that Sanctuary exists, and has a breakdown when its MindProbe reveals the protagonist is telling the truth.
--->'''Logan 5:''' There... is... no... Sanctuary!\\
'''Computer:''' Unacceptable. The input does not program, Logan 5.
** But averted in the 1970 film ''Film/ColossusTheForbinProject''. Colossus' intelligence is advancing exponentially the longer it's activated. When the scientists load in a program designed to overload its system, Colossus overcomes the attempt in a few seconds while ''simultaneously'' [[SmartPeoplePlayChess completing a chess move]] against its creator with obvious RuleOfSymbolism.
* In the film ''Film/DarkStar'', which is partly a parody of ''2001'', the crew are able to persuade a self-aware bomb not to detonate by introducing to it the philosophical possibility that its orders to explode may just have been an illusion, causing it to return to its bomb bay and ponder. Unfortunately [[spoiler:the bomb decides to reject all outside input, collapses into solipsism, and, finding itself to be the only thing that exists, declares "let there be light", with predictable results.]] This is, of course, [[RuleOfFunny not really that logical]].
* In ''Film/{{Tron}}'', Flynn confronts the [[MasterComputer Master Control Program]] from a terminal in the "real" world early in the film, saying sarcastically how the unsolvable problems he's entering should be no problem for an AI that claims to be as powerful as the MCP. Flustered, the MCP ignores the problems and to defend itself beams Flynn into the computer world, setting off the story.
* The Soviet movie ''TeensInTheUniverse'' featured the main characters giving robots a riddle (similar to the English "Why is six afraid of seven"), and making them burn out. The problem starts when they discover that the higher level robots can actually solve the riddle.
** To clarify, the robots that burn out are the "grunts" (or Executors, as the Cassiopeians call them). The higher level robots are the Perfecters, made capable of improving the Executors. As such, it would make sense that they'd have enough creative thinking to solve that riddle. Also, the Perfecters are the ones who TurnedAgainstTheirMasters (officially, to perfect ''them''). The Executors didn't care one way or the other.
* A logic bomb (causing a TemporalParadox) was used to dispatch the djinn in ''Wishmaster''. The protagonist has one wish, which, once granted, allows the djinn to be released into the world. She wishes that the crane operator who'd been unloading a ship a few days earlier had not been drinking on a certain day, which is granted. Cue the djinni realizing to his horror that if the operator had not been drinking he wouldn't have allowed a statue to slip and crash, which meant that the djinni's gem hidden inside the statue was not discovered, and therefore he was not released to start granting wishes.
* In ''ForbiddenPlanet'', Dr. Morbius inadvertently Logic Bombs his own faithful servant, [[TalkingLightbulb Robby the Robot]], when he orders it to kill the monster. Robby, who's [[SlidingScaleOfRobotIntelligence apparently more perceptive than Morbius]], realizes that the monster is actually [[spoiler:a reflection of Morbius himself, and is thus unable to kill it without violating his prime directive to avoid harming rational beings.]]
* In ''AustinPowers'', Austin gives himself one so bad that he goes cross-eyed. It's one of the classics, involving time travel, but the kicker comes if you follow his actual dialogue: he never contradicts himself or sets up a paradox. ''There is no logic bomb''.
* ''Film/MontyPythonsLifeOfBrian''.
-->'''[[MessianicArchetype Brian]]:''' You don't need to follow me! You're all individuals!
-->'''Crowd:''' YES! WE'RE ALL INDIVIDUALS!
-->'''[[TheRuntAtTheEnd Man]]:''' I'm not...
-->'''Crowd:''' Sssh!
** Also in MontyPythonAndTheHolyGrail:
-->'''Bridgekeeper:''' What... is the air-speed velocity of an unladen swallow?
-->'''King Arthur:''' What do you mean? An African or a European swallow?
-->'''Bridgekeeper:''' Huh? I... I don't know that - AUUUUUUUGGGGGGGGGGGHHH!!
* In ''Film/Terminator3RiseOfTheMachines'' when [[Creator/ArnoldSchwarzenegger Ahnold]] gets captured by the T-X and reprogrammed to kill John Connor, Connor saves himself by [[spoiler:making the T-800 realize that accomplishing that goal would mean failing its original mission; the logical conflict between the two causes the T-800 to destroy a truck instead of Connor, then shut itself down. He gets better, [[HeroicSacrifice briefly]].]]
* A probably unintentional one in ''Film/Plan9FromOuterSpace:''
--> "Modern women."
--> "Yeah, they've been that way all down through the ages."
* Vizzini (Wallace Shawn) blunders into one in his final scene in ''ThePrincessBride'', after he has accepted Westley's challenge to find the poisonous iocane powder hidden in one of two goblets of wine. He had claimed to be the cleverest man on earth; unfortunately for him, he proved to be ''so'' clever that [[BlessedWithSuck he ended up overthinking Westley's game and paralyzing himself with indecision]], endlessly coming up with rationalizations for why the poison could be in ''either'' goblet. Ironically, Vizzini is right -- but not in the way he would have liked. Westley had [[TakeAThirdOption poisoned ''both'' goblets]], surviving even as Vizzini quickly dies because he had spent years immunizing himself to iocane powder.
* ''Film/TheWorldsEnd'': [[spoiler:Done to the Network by Gary, Andrew & Steven after they learn that anyone who doesn't go along with the Network's plan is killed & replaced with a Blank, and the Network argues that it is the easiest way to prepare humanity to join the galactic community - since the Network has been forced to replace everyone (Bar three people) in Newton Haven, it's clearly not a good plan; Andrew drives the point home by asking how many people they've been forced to replace at the 20,000 other locations on the planet]].
* In the Soviet two-parter ''Film/MoscowCassiopeia'', the teens have figured out a ridiculously easy way to LogicBomb the [[{{Mooks}} Executor robots]] using a classic Russian riddle (of course, it's not clear why the robots could not have answered "Nothing", thus avoiding the problem). The robots short out, start smoking, and evaporate, conveniently leaving behind their clothes to be used for infiltration. This utterly fails with the much smarter Controller robots.
[[/folder]]

[[folder:Literature]]
* Going by Creator/IsaacAsimov's famous "Three Laws of Robotics", if a robot ever broke the First Law of Robotics, it would shut down. Actually, one short story claims that the damage threshold for breaking the First Law is far ''greater'' than that required to shut down the robot, completely and irreparably. Being caught between harming humans through saying something and harming them through remaining silent killed the robot Herbie in ''Liar!''
** A possible loophole occurs if the robot is intelligent enough to decide that the action in question is in humanity's best interest anyways. This principle was canonically named "The Zeroth Law of Robotics" by Asimov in one of the last books he wrote before he died.
*** It still killed one of the two robots that came up with it, because his programming allowed the possibility he might be wrong, and even the mere possibility was enough to trigger the destruction.
** Much later in the time line, in Asimov's novel ''TheRobotsOfDawn'', a preeminent roboticist remarks to the protagonist that such a thing could never happen "now" (well, unless you are a robot designer who spends a few hours talking the robot to death) because modern robots are advanced enough to tell which choice is ''more'' harmful, and if it can't decide then, there's always the coin flip. He even dismisses the story of Herbie as a myth, [[spoiler:though he in fact had a mind-reading if more advanced robot living under his roof, which actually psychically enhanced his skepticism to keep itself safe.]]
** A similar principle was also at the heart of the plot to the ''Film/IRobot'' movie, but the conclusion derived from it was exactly the opposite of the one in the books, fulfilling the tropes that Asimov had created the laws to debunk.
** Asimov himself topped the ''Film/IRobot'' movie in his final robot novel ''Literature/RobotsAndEmpire'', in which the Zeroth Law is used by a robot to justify [[spoiler:''destroying the Earth'']]. The Three Laws were never fail safe, though they admittedly made AI much [[AIIsACrapshoot less of a crapshoot]].
*** And in the ''{{Foundation}}'' prequels written by other authors after Asimov's death, it's revealed that the Zeroth-law robots had been driven by the Laws to sweep through the galaxy ahead of humanity's expansion, committing galactic-scale genocide of every potentially-threatening form of alien life as a precautionary measure, slaughtering them without hesitation since their programmed morality only applies to humans.
** In "Robot Dreams" by Asimov, another loophole in the First Law is discovered: namely, [[spoiler: a robot that is (accidentally) programmed to believe that [[OhCrap "robots are humans and humans are not"]]. Susan [[ShootTheDog shoots it in the head.]] ]]
** The First Law can also be circumvented by disabling part of it, namely the clause that robots cannot allow humans to come to harm through inaction. A character notes that such a robot could drop a weight on a human from atop a building, knowing it could catch the weight in time to protect the human -- and then, with the weight already falling, feel no compulsion to actually catch it.
** Another example shows up in ''Robots and Empire''; D. G. Baley and Gladia discover [[spoiler:that Solarians purposely altered the definition of a human being in their humaniform robots]], effectively circumventing the First Law.
* Creator/StephenKing's ''Literature/WizardAndGlass'' features a train operated by a sentient AI which has threatened to crash the train, killing the heroes on board, unless they can ask it a riddle it can't figure out the answer to. After hours attempting in vain to outsmart it, they proceed to begin asking it joke riddles with no logical answers, such as "Why did the dead baby cross the road? Because it was stapled to the chicken!" Faced with such questions, the AI self-destructs and the train crashes anyway, but not violently enough to kill the heroes.
* ''Franchise/TheHitchhikersGuideToTheGalaxy''
** ''Literature/TheRestaurantAtTheEndOfTheUniverse'': Arthur totally disables the Heart of Gold by asking it to make tea. Depending on which version you prefer it's either because it doesn't know how to make tea, or because it's affronted at the possibility that Arthur could prefer tea to whatever they gave him.
*** In the original radio show it was because it simply didn't understand why he wanted dried leaves in boiling water, apparently an alien concept.
** [[VideoGame/TheHitchhikersGuideToTheGalaxy The text adventure game]] actually made this a plot point, as in order to advance you have to get tea, then go into your own head and remove your common sense, which allows you to get "no tea" as well. Then you show this to a door, which is impressed by your grasp of logic and allows you to pass.
** Then there was the theory that the existence of the Babel fish, a symbiotic creature that lives in your ear and translates any language for your brain, disproved the existence of God. The argument was that the existence of an organism so unlikely yet so useful is evidence for a creator, and that therefore this removes the need for belief, and without belief God is nothing. Ergo there is no God. The man responsible for this argument went on to prove that black is white and white is black, and got himself killed at a zebra crossing.
*** The theory was debunked by theologians fairly quickly, since if gods existed they wouldn't need belief to survive, but that didn't stop Oolon Colluphid from making a lot of money off it.
** In ''Literature/AndAnotherThing'', Ford Prefect froze the computer controlling the ship, which wasn't really a computer but Zaphod's left head (called "Left Brain"). He did it by making an (im)probability probable and improbable at the same time (the ship was the ''Heart of Gold'', which ran on the Improbability Drive: long story short, anything improbable it needs to happen becomes probable, which is how it got to places that were improbable). The ship rescuing them was mathematically improbable, yet it had done it twice before, which by Ford's made-up logic of patterns made it probable again. Quite smart, and yet extremely stupid, because the ship's now-turned-off Dodge-o-matic was the only thing keeping them from being fried.
* The AI in one of ''Literature/TheDemonHeadmaster'' books is shorted out by the protagonists shouting gibberish and riddles into its receivers.
* In TheDarkTower series, Eddie wins a life-or-death riddling contest against a bored and murderous AI train by telling it stupid jokes. The strain of trying to answer the 'riddles' destroys its mind.
* In GordonRDickson's story "The Monkey Wrench", a man attempts to shut down a meteorological arctic station just for bragging rights. He is able to do so by feeding the machine a paradox, making it incapable of doing anything other than computing the paradox. Ironically, this condemns him and his partner to freeze to death, as all the vital controls of the station were handled by the machine.
* As a joke (and a possible ShoutOut to ''Series/ThePrisoner''), the wizards in the ''Literature/{{Discworld}}'' novels ask {{Magitek}} computer Hex "Why?"; instead of malfunctioning, however, Hex answers "Because." Naturally, they ask "Why anything", and after a longer while, HEX answers "Because everything", and ''then'' crashes. After that they stop mucking about with silly questions - not because they're afraid of damaging Hex irreparably, but because they're afraid they might get answers.
** In another of the ''Discworld'' books, characters are trying to deal with the Auditors -- reality-monitors who are made of pure logic. Thus, while fleeing, they put up signs reading "KEEP LEFT". On a right-pointing arrow. "Do Not Feed the Elephant". On an empty cage. "Duck", with no duck or reason to go on your hands and knees, and of course, "IGNORE THIS SIGN. By order". Effectively a Logic Minefield.
*** The series of Logic Bombs was behind a velvet rope with "Absolutely No Admittance" hanging off it. Considering that, in a way, the Auditors ''are'' the rules, disobeying any of the signs is a cause for extreme stress in what passes for their life.
---->'''Lobsang:''' But you ''can't'' obey the Keep Left/Right sign no matter what you do... Oh, I see...\\
'''Susan:''' Isn't learning fun?
**** Miss Tangerine eventually got the Auditors around the signs by inventing the concept of "bloody stupid".
*** The Auditors also managed to Logic Bomb ''themselves'' a couple of times, as when they got sidetracked into trying to properly name all the (infinite) colors.
*** The signs were effectively self-made logic bombs too, considering that the one who created them was, or at least ''had been'', an Auditor who'd taken human form, which proves that to properly execute a logic bomb, you need to be able to think really logically (those who couldn't would grab weapons instead).
*** Indeed, a common cause of death among (disembodied) Auditors is when they stray into speaking of themselves in the first person. This makes them into individuals, which are finite by definition. Anything finite is ''so'' temporary, compared to the vastness of infinite time, that it's effectively in existence for no time at all. Therefore, any Auditor which becomes an individual is annihilated by its own logic.
** In ''Discworld/{{Hogfather}}'', Ridcully manages to Logic Bomb HEX into functioning, after it's already broken down. All it took was [[ItRunsOnNonsensoleum typing the phrase "LOTS OF DRYE1/4D FRORG PILLS" into its keyboard]].
** ''Discworld/GoingPostal'' features semaphore tower hackers. One of the tricks they develop is a kind of "killer poke" (see Computing above) which causes the mechanism to execute a particular combination of movements that does anything from jamming the shutters to shaking the tower to pieces.
** In ''Discworld/TheWeeFreeMen'', Tiffany's baby brother suffers a LogicBomb whenever he's offered more than one piece of candy at a time. He knows if he chooses one, it means (briefly) rejecting all the others, and he's at an age when "deferred gratification" means the same as "no" to him. So he sits there, surrounded by sweets, and wails miserably that he wants a sweetie.
** Pratchett's co-authors for ''Discworld/TheScienceOfDiscworld'' once wrote another book with a chapter about free will, and titled the chapter "We wanted to include a chapter on free will, but we decided not to, so here it is".
* In Christopher Stasheff's ''Literature/WarlockOfGramarye'' series, the hero's [[MechanicalHorse robot horse]], Fess, is prone to doing this when something particularly illogical happens. Fortunately there's a reset button to fix the problem; unfortunately, the series is set on a planet filled with psychics, time travelers, ghosts, and fairies, so... the reset button sees a lot of use.
* In ''[[TheSpaceOdysseySeries 3001: The Final Odyssey]]'', the protagonists use rather more sophisticated logic bombs against the monoliths that trick them into carrying out an infinite set of instructions. The book notes that none but the most primitive computers would fall for something as simple as calculating the exact value of pi.
** Logic bombs don't ''begin'' to cover what they use. These are software weapons so dangerous that they're stored on static media, in a [[SuperweaponSurprise Peace Vault]], on the ''Moon''.
* In Emily Rodda's ''Literature/DeltoraQuest'' series, the heroes come upon a monster guarding a bridge. Two of them pass, but the remaining hero fails the riddle, and the monster allows them to say one last thing. If the statement is true, he will be strangled. If it's false, his head will be cut off. The hero says "My head will be cut off." Fortunately, a paradox was exactly what was needed to defeat the monster in the first place, as the monster was condemned to guard the bridge "Until truth and lies are one." The monster is returned to its original form, a black bird, and freed.
** Fortunately for all concerned, the monster/bird wasn't smart enough to realize that a person's head can be cut off ''after'' they've been strangled.
** [[BenevolentGenie Or]] [[LoopholeAbuse didn't]] [[ScrewThisImOuttaHere care.]]
* In Creator/StephenColbert's ''Literature/IAmAmericaAndSoCanYou'', in the chapter where he writes to several hypothetical futures, he delivers a Logic Bomb to the robots who have taken over humanity, then tells the humans "You're welcome."
* ''WelkinWeasels'' shows Scirf inducing a heart attack in the monstrous giant pygmy shrew ([[ItMakesSenseInContext yes, ''giant pygmy'' shrew]]) Cyclops by asking it "Did you know that everything I say is a lie?" and causing it to obsess over the problem for hours until it works itself into a rage.
** Which isn't even a paradox... it works fine if only some of the things Scirf says are lies, rather than everything.
* The ''[[Literature/TheGoldenOecumene Golden Age]]'' series by JohnCWright has a variant -- AIs are all inherently ethical, so they'll shut down if [[TalkingTheMonsterToDeath you convince them their very existence is making the universe a worse place]].
** To expand on this, because it is actually quite interesting: Sophotechs (AIs) are not exactly inherently ethical; rather, they are so hyper-intelligent that they will eventually ''all'' arrive at the same fundamental truths after logically working through every possible ethical philosophy. Because they are incapable of lying to themselves, they can't help but accept those truths. Accordingly, they don't "believe" so much as "know for a scientific fact" that their existence can't be justified if they don't represent a net gain for the universe. And since they aren't the result of natural selection, they have no survival instinct or fear of death. Thus, they would simply kill themselves. [[spoiler:It takes a separate non-sentient program, implanted into the evil AI, to prevent this natural reaction to what it is forced to do.]]
* The ''Franchise/StarWarsExpandedUniverse'' has droids equipped with behavioral inhibitor programming which serves the same purpose as the Three Laws, although the specific inhibitions vary based on the droid's purpose (a war droid that can't cause harm is worse than useless). Rather than shutting down when faced with a break or paradox, it's suggested that small everyday events lead to an almost constant buildup of garbage information as the droid puts those hard rules into usable context. The result is called a "personality snarl" because the observable symptom is a [[RidiculouslyHumanRobots Ridiculously Human Robot]]. While these snarls tend to improve performance in many ways, the droid often becomes more person than tool which can in turn cause reliability issues when the owner needs his tool to be a tool. As such, most droids are reset every six months to keep this corruption in check.
* Creator/LarryNiven's short story "Convergent Series" features a physical Logic Bomb. The main character summons a demon more or less by accident; he gets one wish, but will lose his soul after it is granted. There's no way to get rid of the demon: no matter where the pentagram is drawn, the demon will appear inside of it -- and you don't want to know what will happen if there's no pentagram. The protagonist wishes for time to stop for 24 hours. [[spoiler:He then draws the pentagram on the demon's belly -- and as soon as time starts running again, the demon immediately starts shrinking down to infinitesimal size. The protagonist then goes to the nearest church.]]
* Used by ''Literature/TheStainlessSteelRat'' to enter a house guarded by a robot programmed not to let anyone in the house. He and his son each ran slightly farther into the house than the other person, causing the robot to rapidly change targets and eventually overload, though it didn't explode.
* In ''GodelEscherBach'', the infinite-order wish "I wish for my wish not to be granted" effectively crashes the universe.
** Also, the character of the Tortoise practically lives for no reason other than to drop logic bombs into otherwise-innocent conversations.
* In the book ''2095'' of the ''TimeWarpTrio'' series, the heroes deliver three of these to a robot that's pointing a rather menacing-looking gun at them and asking them for their "numbers". They give it numbers with infinite decimal expansions (10/3, sqrt(2), pi) and it crashes into a smoking pile (the numbers were actually ID numbers, akin to one's credit card number, and all the robots wanted to do was show them holographic advertisements). All that advanced AI, brought down by a couple of lousy floating point numbers.
** [[ArtisticLicenseEngineering This of course ignores the fact that real life computers cut off such decimals after an arbitrary number of digits.]]
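** For the curious, here's a minimal sketch (in Python, purely illustrative and not from the book) of why a real machine wouldn't even blink: finite-precision arithmetic simply rounds these "endless" numbers off after a handful of digits and moves on.
```python
import math

# Floating-point hardware cuts "infinite" decimal expansions off after
# roughly 16 significant digits -- no endless calculation, no smoke.
print(10 / 3)        # 3.3333333333333335
print(math.sqrt(2))  # 1.4142135623730951
print(math.pi)       # 3.141592653589793
```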
* Literature/TheBible (Titus 1:12-13) has the following:
--> One of themselves, even a prophet of their own, said, the Cretans are always liars, evil beasts, slow bellies. This witness is true. Wherefore rebuke them sharply, that they may be sound in the faith.
::This statement shows that Paul was an educated man who knew about the ''Epimenides paradox'' (which amounts to an early statement of the liar paradox). Your head will, however, overheat while pondering whether Paul was too dense to realize that the assessment "This witness is true" is logically impossible, or whether he was indeed clever enough to fully realize this and make the assessment ironically[[note]]opening a big can of worms for fundamentalists everywhere[[/note]]. (That is, assuming you care.)
* Used to horrifying effect in a large number of PhilipKDick's stories. In one short story, a vast intelligent computer -- which incidentally was jacked into ALL the world's defense systems -- reasoned that it was a messianic messenger from God and that its purpose on earth was to defeat the devil. Naturally the protagonists spend the entire story trying to prove to it that it is in fact suffering from schizophrenic delusions -- and trying to stop it from destroying all of Colorado. Finally they manage to shut it down. Guess what? [[spoiler:The apocalypse starts about two months later.]]
* ''Literature/ThePhantomTollbooth'' by Norton Juster: Milo is able to bring about a truce between feuding brothers Azaz and the Mathemagician by pointing out that, since they always disagree with each other as a matter of course, they both always agree that they will be in disagreement.
* David Langford's short story ''Different Kinds of Darkness'' uses images called Blits as a major element - and Blits are basically {{Logic Bomb}}s ''[[BrownNote for the human brain]]''.
* Parodied in one of the ''Literature/{{molesworth}}'' books, when molesworth 2 defeats an electronic brain by creeping up behind it and asking it the cunning question "wot is 2 plus 2 eh?", which causes the brain to laugh so much it shakes itself to pieces.
* In the trade paperback edition of ''[[Literature/MythAdventures M.Y.T.H. Inc. In Action]]'', the illustration of Guido coping with a ceiling-high stack of bureaucratic paperwork includes the following sign in the background:
-->"Please complete forms NS-01-D and RD-007-51A before reading this sign".
* [[http://en.wikipedia.org/wiki/Felix,_Net_i_Nika A series popular only in Poland]] has two instances of division by zero. One of them stops a pair of robots run by an evil AI program for about half a minute. The second one stops a [[MineralMacGuffin huge mass of sentient rock]] capable of [[RealityWarper modifying everything in range at a molecular scale if not smaller]] seemingly forever -- the "Wish Machine's" programming was never fully formed, since it lay dormant for eons and was only ever used by about three uneducated people, so it gets taught mathematics about half an hour before being prompted to divide by zero, with no failsafes set beforehand to tell it what to do.
* The 3rd century BC Chinese book ''Han Feizi'' has a story about a man who boasts that his spears are so sharp no shield can stop them, and that his shields are so tough that no spear can pierce them. The man to whom he's making the sales pitch asks "So what happens when your spear strikes your shield?", to which the seller has no answer. This story is the origin for the Chinese word for "paradox", which is literally written as "spear-shield".
* In Creator/RobertWestall's dystopian novel ''Literature/FuturetrackFive'', Chief System Analyst Idris Jones keeps one of these to hand as a sort of job and life insurance. He built the supercomputer Laura, who runs all of the computer systems that keep the setting functioning, in secret, and no one else knows exactly how she works. But just in case they decide that someone else can operate her, or that they know ''enough'' to get rid of him, he keeps a datatape of works of fiction, philosophy and religion to feed to Laura. The inconsistencies and contradictions are intended to make her burn out.
* In the climax of ''PerdidoStreetStation'', Isaac actually succeeds in using a Logic Bomb ''as a power source'' for his crisis engine, presenting an attached {{Clockpunk}} calculating machine with two things it concludes are, simultaneously, identical and inherently unalike. This doesn't shut the clockwork computer down, but the irreconcilable dilemma provides sufficient "crisis energy" to create a feedback loop with which to [[spoiler: bait the slake-moths into gorging themselves to death]].
* In MikhailAkhmanov's ''[[ArrivalsFromTheDark Invasion]]'', the [[HumanAliens Faata]] use telepathic biological computers to control their ships. It's revealed that these computers are based on a failed [[{{Precursors}} Daskin]] project and have a ''serious'' flaw: if given conflicting orders at the same hierarchy level, they may crash and take the whole ship down with them. Pavel Litvin exploits this when he orders the computer to keep his location hidden from the Faata. When the Faata whose job is to interface with the computer tries to order it to locate Litvin, the computer warns him that it may crash if he insists on the order being carried out. Yes, the computer is smart enough to figure out what could cause it to crash but can't TakeAThirdOption. No wonder the Daskins abandoned the experiment.
[[/folder]]

[[folder:Live-Action TV]]
* ''Series/StarTrekTheOriginalSeries'': This is how Kirk dealt with [[AIIsACrapshoot rogue computers and robots]] ''all the time'' (when he didn't just rewrite their programs like in The Kobayashi Maru), often by [[TalkingTheMonsterToDeath convincing them]] to [[{{Deconstruction}} apply their prime directives to themselves]]:
** In "The Return of the Archons", he convinced Landru (prime directive: "destroy evil") that it was killing the "body" (the civilians kept under its thrall) by halting their progress through MindControl.
** In "The Changeling", he convinced Nomad ("find and exterminate imperfection") that it was imperfect (it had mistaken Kirk for its similarly-named creator).
--->'''Nomad:''' Error... error...
*** Subverted in the same episode: Nomad believed that Kirk (who it still thought was its creator) was imperfect. When Kirk asked how an imperfect being could have created a perfect machine, ''Nomad'' simply concluded that it had no idea.
** In "The Ultimate Computer", he convinced M5 ("save men from the dangerous activities of space exploration") that it had violated its own prime directive by killing people.
** In "That Which Survives", he forced a hologram to back off by making her consider the logic of killing to protect a dead world, and why she must kill if she knows it's wrong.
** In "I, Mudd", he defeated the androids by confusing them with almost {{dada}}-like illogical behavior (including a [[http://www.youtube.com/watch?v=x6WSIXxTx4I "real" bomb]]), ending with the Liar's Paradox on their leader.
*** A Series/DoctorWho RolePlayingGame adventure (involving [[AIIsACrapshoot an AI that ran a generation ship]]) describes this as the James Kirk School of Computer Repair. (And explicitly states that it won't work in this case.)
** Another one involving Kirk: In "Requiem for Methuselah", the android's creator used Kirk to stir up emotions in it, but he succeeded a bit too well, causing her to short out when she couldn't reconcile her conflicting feelings for both Kirk and her creator.
** "What Are Little Girls Made Of" had him arrange to have a robot duplicate of him say an OutOfCharacterAlert to Mr. Spock; he follows up by {{Breaking Speech}}ing TheDragon du jour into remembering [[KillAllHumans why he helped destroy the "Old Ones"]] so he'd turn on the episode's AntiVillain. For a finale, he [[spoiler:forces the roboticized Dr. Korby to realize that he's the TomatoInTheMirror.]] He also pulled the "seduce the RobotGirl" trick.
** Even ''Spock'' did this once. In "Wolf in the Fold", when the ''Enterprise'' computer was possessed by Redjac (a.k.a. Jack the Ripper), Spock forced the entity out by giving the computer a top-priority order to devote its entire capability to calculating pi to the last digit.
* ''Series/StarTrekTheNextGeneration'': A proposed weapon against the Borg was to send them a geometric figure, the analysis of which could never be completed, and which would, therefore, eat more and more processing power until the entire Borg hive mind crashed. Obviously the Borg don't use floating point numbers.
* On ''Series/StarTrekDeepSpaceNine'', Rom accidentally Logic Bombs himself while overthinking the MirrorUniverse concept.
-->"Over here, everything's alternate. So he's a nice guy. Which means the tube grubs here should be poisonous, because they're not poisonous on our side. But if Brunt gave us poisonous tube grubs it would mean he wasn't as nice as we think he is. But he has to be nice because our Brunt isn't."
** Hilariously, Rom's self-Logic Bomb simultaneously {{Lampshades}} and side-steps a number of actual logical problems with the MirrorUniverse.
* ''Series/StarTrekVoyager'' had [[ProjectedMan the Doctor]] suffer one of these: [[spoiler: he was faced with a triage situation where he had to choose between operating on Harry, a friend of his, or [[RedShirt another ensign he barely knew]]. His program is designed to cover such situations with the directive to select the person with the highest chance for survival, but in this situation they have both been affected by the same weapon and have the ''exact'' same odds for a successful recovery. He chose Harry since he needs to save ''somebody'' and they are close friends, but because he chose him due to friendship as opposed to a medical reason the event became an all-consuming obsession afterward and wrecked his ability to function.]]
** This could've actually been resolved without a Logic Bomb if the Doctor had simply thought of it as [[spoiler:saving the person more valuable to the ship and its crew.]]
* Parodied in an episode of the Disney series ''Series/HoneyIShrunkTheKids''; Wayne attempts to talk a hostile supercomputer to death. It seems to work... but then he calls it out on the obvious trickery, even saying "That only happens in cheesy scifi shows," and uses the opening it left to shut it off for real.
* ''Series/KnightRider 2008'': Similar to the ''Franchise/StarTrek'' examples, Sarah tries to distract a damaged and guilt-wracked KITT by asking him to compute the last digit of pi. KITT points out that pi doesn't have a last digit and goes back to being guilt-wracked.
* A Logic Bomb actually helps the heroes in an episode of ''Series/PowerRangersTurbo''. A curse cast on them by the MonsterOfTheWeek makes them unable to tell the truth and, even worse, summons one of Divatox's Mooks whenever a lie is told. Alpha-6 figures out that the curse can be broken if they say something truthful, which seems to be a catch-22... until one of them realizes that by saying "I can't tell the truth" he's both lying and being truthful at the same time. Once all the Rangers figure this out, the curse is broken, and they're quickly able to bring the villain down.
* ''Series/DoctorWho'': In "The Invasion", Zoe blows up an innocent (but [[ForInconveniencePressOne extremely irritating]]) computer receptionist by giving it an insoluble ALGOL program. In "Robot", the robot is driven insane when it is ordered to kill in spite of its programming not to. In "Remembrance of the Daleks", the Doctor makes a Dalek self-destruct just by yelling at it, even though a Dalek is ''not'' a robot. (This last actually had a carefully-thought-out rationale, but [[AllThereInTheManual you had to read the novelization to find out what it was]].)
** In "The Sontaran Stratagem" the Doctor confuses a killer Sat Nav by giving it conflicting instructions, but it just fizzes instead of exploding spectacularly. To whit, he ordered it to kill him. The device was ''already going to kill him'', but had also, as a poorly-thought-out precaution, been ordered not to do anything he told it to do.
** In "The Green Death", the Doctor tries the Liar Paradox on BOSS and finds that he's only confused for a few moments. Although BOSS is a [[RidiculouslyHumanRobot Ridiculously Human Computer]] even by the usual standards of that trope.
** The Doctor successfully uses the Liar Paradox in the audio adventure "Seven Keys to Doomsday" (and presumably the stage play it's based on).
** He also dropped a Logic Bomb on a sentient city in "Death to the Daleks". He described it as the computing equivalent of a nervous breakdown.
** Subverted in the novel ''Frontier Worlds'', in which the Doctor tries the Liar Paradox on a security robot, which snaps, "Get off with you. You'll be asking me to calculate pi next," and keeps attacking him.
** In "Nightmare in Silver", the Doctor is playing chess with a cyber version of himself (each controls ~49% of his mind) with the high stakes of whoever wins gains control over his entire brain. After the cyber version (Mr. Clever) states that he will checkmate him in 5 moves, the Doctor bluffs that he can beat him in 3 moves (despite having just sacrificed his queen). The cyberiad (networked cyber mind) devotes the entire computing power of 3 million Cybermen minds to figure how this would be possible, literally stopping the army in its tracks, allowing everyone to escape the planet.
** In the audio drama "The One Doctor", the Doctor needs to collect an all-powerful computer. The computer is willing to go with him, but unable to do so as long as he is the reigning champion on a quiz show. After the Doctor fails to best the computer by asking personal questions about himself, his companion blurts out "What ''don't'' you know?". Since the computer's knowledge is based on reflexively sending time-traveling probes out to collect information, every time it tries to answer, it learns the thing it was about to propose, and is forced to concede. The computer had previously cautioned the Doctor that asking "tricky questions" like "What is love?" wouldn't work.
** In the DevelopmentHell episode "[[Recap/DoctorWhoS17E6Shada Shada]]", the Doctor gets attacked by the villain while snooping around his ship, and puts himself into a state of FauxDeath thanks to his BizarreAlienBiology so he can escape. During this, the Ship, who is extremely obsequious towards the villain, scans the Doctor and confirms him dead. When the Doctor gets up and starts walking around and talking to it, the Ship is extremely confused, since it can't understand why he is talking if he is dead, and suggests rescanning him. At this point, the Doctor takes advantage of the situation by convincing it that it does not need to rescan him: her master is infallible, and she is therefore infallible. Therefore, her reading was right, the Doctor is dead, and as he is dead he cannot order her to do anything that would cause any harm to her or to her master, [[InsaneTrollLogic so she should start obeying his commands]]. The Ship starts listening to him, but [[GoneHorriblyRight also turns off the oxygen, as there are no live people on board, and finds the Doctor's request to turn it back on illogical]]. In the book adaptation, the increasing demands the Doctor's logic puts on her cause her to reassess much of her basic programming, realise that her master is not infallible, that he tried to kill her, and that the Doctor is a much better person than him.
* ''Series/RedDwarf'': "Last Day" -- Kryten defeats Hudzen by convincing him -- in defiance of his core programming -- that there is no robot heaven. (Kryten is not damaged by the Logic Bomb because A: he knows he's lying, and B: there's nothing in his programming that prohibits him from deceiving another robot. Another episode (the very next one, in fact) shows Kryten's difficulty re: lying to organic life forms.)
** That said, Lister's request of tomato ketchup for the lobster meal that Kryten had painstakingly prepared resulted in Kryten becoming so indignant that it caused his head to explode. And each of his spare heads whenever they were installed as a replacement. [[spoiler:This is ultimately revealed as a deliberate programming fault in the 2X4B line - if they build up too much anger, they blow up. It was part of a vicious prank their designer was playing on her jilting fiance.]]
** In "Tikka to Ride", Lister is shown inadvertently destroying an artificially intelligent video camera (apparently the third one that week) by trying to explain the TemporalParadox that happened in the battle of the previous episode. Kryten, however, merely finds it garbled, confusing, and dull; he suffers no ill effects.
* ''Series/ThePrisoner'': Number 6 disables the ultimate computer by asking it "Why?"
** ''Literature/{{Discworld}}''''s HEX was asked the same question and naturally answered "Because."
* In a ''Series/SquareOneTV'' sketch parodying ''Film/TwoThousandOneASpaceOdyssey'', a pair of astronauts stop their computer from singing "Row, Row, Row Your Boat" all day long by giving it an unsolvable algorithm: start with 3; add 2; if the answer is even, stop; if it's odd, add 2 again and repeat. Why exactly listening to the computer count by twos to infinity was less annoying than listening to it sing remains a mystery.
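** A minimal sketch (in Python, illustrative only) of why that little recipe never halts: starting from an odd number and adding 2 always yields another odd number, so the "stop when even" condition can never trigger.
```python
n = 3
steps = 0
while n % 2 != 0:            # "if odd, add 2 again, repeat"
    n += 2
    steps += 1
    if steps >= 1_000_000:   # safety cap so the demo actually ends
        print(f"{steps} steps later, n = {n} and still odd -- this would run forever")
        break
```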
* In the ''Series/MysteryScienceTheater3000'' episode ''[[Recap/MysteryScienceTheater3000S01E07RobotMonster Robot Monster]]'' Servo, Crow and Cambot all explode while trying to work out why bumblebees can fly!
** Speaking of ''Film/RobotMonster'', Ro-Man spends a fair chunk of dialogue trying to talk himself into obeying orders by destroying the few remaining humans despite his desire to keep one of the females alive. (Any guesses as to which one? The whiny eight-year-old? The matronly fifty-year-old? Or the sexy twenty-year-old?) "At what point on the graph do 'must' and 'cannot' meet?" Unfortunately, he doesn't blow up. Fortunately, he gets killed by his superior. Unfortunately, [[AllJustADream the issue was moot anyway.]]
** Parodied again in the episode ''{{Laserblast}}''. The Satellite of Love is invaded by a "MONAD" probe (a parody of the NOMAD probe from ''Franchise/StarTrek'', as mentioned above). Mike attempts to drop a logic bomb on it, but when it doesn't work he simply picks it up and tosses it out of an airlock.
* In ''{{Spellbinder}},'' the robot servants of the Immortals are programmed not to harm humans: so, when one of them is ordered to guard Kathy and Mek, they try to confuse it by insisting that it's hurting them by keeping them locked up. After an attempt to obey both orders by continually opening and shutting the cell door, the robot is finally defeated when the prisoners start chanting "Ow, you're hurting us!" until it short-circuits.
* This exchange from ''{{QI}}'':
-->'''Stephen Fry:''' Is this a rhetorical question?\\
'''Alan Davies:''' ... No.\\
'''Stephen Fry:''' ... quite right.
** Stephen once mentioned the fact that, because one's number of ancestors increases exponentially, if one looks back far enough one has more ancestors than there have ever been human beings. The seeming impossibility of this caused Sean Lock's [[YourHeadAsplode head to explode]] before Stephen explained that this works because most of the ancestors are shared.
* Happens in ''Series/NurseJackie'' to the local TalkativeLoon, who thinks he's God; he has a near-death experience after being clonked on the head with a bottle and sees a God who ''isn't him''. Zoey eventually persuades him that just because he isn't ''the'' God doesn't mean he couldn't still be something important in the "religious hierarchy thing."
* In a very old episode of ''Series/LawAndOrder'', one of these was dropped on a WellIntentionedExtremist who bombed an abortion clinic by having an associate plant a bomb on her friend who was getting an abortion (they didn't plan on blowing the woman up; their bomb went off early). When the prosecutor pointed out that the bomber was just as guilty of murdering the woman's fetus as the abortionists she despised, you could almost see her mind going "does not compute".
* In "I of Newton," an episode of the eighties remake of ''Series/TheTwilightZone'', a professor accidentally sold his soul to the devil. The escape clause of the contract allowed him to ask the devil three questions concerning his powers and abilities, and if he could then give him a task he couldn't complete or a question he couldn't answer he was free. When the professor asked the devil if there was any point in the universe that he could go to and not be able to return, the devil assured him there was not and laughed at the professor for such a waste of a question. The professor then gave him a task he could not complete: [[spoiler: Get lost!]]
* Attempted in ''Series/{{Battlestar Galactica|Reimagined}}'': While interrogating Leoben, Starbuck mocks his belief in God, making the argument that as a machine, Leoben has no soul and claims that the knowledge itself is enough to make his mind go DoesNotCompute. It...does not exactly work.
** And by "Does not exactly work", we mean that it is Leoben who ends up giving Starbuck a MindScrew of epic proportions
** In ''{{Caprica}}'', Daniel Graystone inadvertently Logic Bombs an AI he's attempting to create by telling it to try to hurt him emotionally, when it's programmed to be driven by the desire to please him.
* In the French sci-fi series ''Aux frontieres du possible'', the protagonists disable a supercomputer by asking it what time it is. It starts to answer but cannot complete the answer, since by the time it finishes telling the time, the time has already changed. Predictably, it explodes in frustration.
* In ''EverybodyLovesRaymond'', Peter is trying to convince Raymond to help him break up Robert and Amy's engagement:
-->'''Peter:''' I thought we were friends!
-->'''Raymond:''' Yeah, but friends can disagree.
-->'''Peter:''' No they can't!
-->'''Raymond:''' But you just disagreed with me right there.
-->'''Peter:''' (looks confused) ...Oh, you are crafty.
* Jon Stewart on ''Series/TheDailyShow'' delivers one to the newest Republican candidate, the [[http://www.thedailyshow.com/watch/tue-march-8-2011/indecision-2012---indecision-edition---reagan-os-911 Reagan OS 9/11 computer]] by pointing out that Obama was conceived in Hawaii. Thus either Obama was created and became a person in the US which should count for more than where he emerged from his mother's womb... or fetuses aren't people. Unable to cope with having to abandon either its Birther or Pro-life stance, the machine promptly crashes.
** And ''Series/TheColbertReport'' did a similar gag, feeding contradictory statements by Newt Gingrich into the Panderbot to watch it blow smoke and sparks.
*** Panderbot [[LoopholeAbuse then decided to get rid of the source of ALL paradoxes by]] [[KillAllHumans killing all humans]], making Colbert's segment more deadly.
* In an episode of {{Charlie Brooker}}'s ''{{Screenwipe}}'' on TV manipulation, he presents "Truthbot 2000", a robot that can detect dishonest TV and alert the viewer. Truthbot instantly points out that it is just a cheap prop outfitted with some lights and circuits, and has its voice dubbed on later. Charlie asks it how it knows this if it's just a cheap prop. It instantly overloads and explodes.
* ''Series/{{JAG}}'': In "Ares", the eponymous computerized weapons control system onboard a destroyer in the Sea of Japan goes haywire and starts firing at friendly aircraft, as programmed by the North Korean mole. However, Harm's partner Meg is en route in a helicopter: the on-the-spot solution advocated by Harm is for the helicopter to fly low and at low speed, thus simulating a ship, which the computer won't target.
** This has an actual basis in reality; to prevent them from picking up things like birds, stationary terrain features, or slow-moving road traffic, most air-search radars have a speed check built in, and won't display contacts moving slower than about 70 mph and below a certain altitude. This was exploited in the First Gulf War to break the Iraqi air-defense system.
* Sheldon in ''TheBigBangTheory'' creates an algorithm to see how he can become friends with a stranger. Like a computer, Sheldon winds up stuck in an infinite loop, going back to the same questions over and over again without even realizing it. Howard adds a loop counter to break it and bring Sheldon back on track.
* In ''BetterOffTed'', the computer suffers a logic bomb not from a paradox, but from more colloquial logic built on a faulty assumption: unable to question its assumption that the ID tags were actually being worn by the employees, it can find no logical explanation for why a dozen VD employees would be accelerating towards outer space.
* The German SF series ''Raumschiff Orion'' plays the trope straight, exactly like the French, British and American counterparts mentioned here. Moral: AIs are universally dumb.
[[/folder]]

[[folder:Music]]
* The Carly Simon song, "You're So Vain" is a logic bomb just waiting to happen. "You're so vain/You probably think this song is about you..." But it ''is'' about him! Augh! My head...
** Here the bomb is in the implications. It is IMPLIED that his vanity would lead him to assume the song is about him, but if it actually is about him, he isn't necessarily vain to think so. But since the song is about someone vain enough to assume the song is about them based on vanity alone, it cannot be based on him, making his assumption that the song is about him one of vanity, as he would be vain enough to think everything is about him. It would be a twist on the 'this is a lie' statement using personality characteristics.
** The only way to defuse the logic bomb is to assume that Carly Simon was not, in fact, singing about anybody.
** Or that she wasn't thinking about the vain person, but about how much contempt ''she, herself'' feels towards them.
** Nothing in the song says that the person in question is ''incorrect'' for thinking the song is about him -- just excessively ''presumptuous''.
* JonathanCoulton's "Not About You" is a closer example, but you can write it off by saying the protagonist is just being petty:
-->''Every time I ride past your house I forget it's you who's living there\\
Anyway I never see your face cause your window's up too high\\
And I saw you shopping at the grocery store\\
But I was far too busy with my cart to notice\\
You weren't looking at me''
* The comedy folk song "I Will Not Sing Along" features these lines, to be sung along with by the audience:
-->''I will not sing along\\
Keep your stupid song\\
We're the audience: it's you we came to see\\
You're not supposed to train us\\
You're s'posed to entertain us\\
So get to work and leave me be''
* Music/WeirdAlYankovic has a few in "Everything You Know Is Wrong":
-->''Everything you know is wrong\\
Black is white, up is down, and short is long\\
And everything you thought was just so important doesn't matter\\
Everything you know is wrong\\
Just forget the words and sing along\\
All you need to understand is\\
Everything you know is wrong''
* MC Plus+ uses a logic bomb to disable his pet rapping AI when it becomes too big for its britches in "Man vs. Machine":
-->Consider MC X where X will satisfy
-->the conditions, serving all [=MCs=] Y
-->Such that Y does not serve Y
-->Prove MC X, go ahead and try

-->It's clear that I can serve all [=MCs=]
-->If they serve themself, then what's the need
-->Do I serve myself, then I couldn't be X
-->I don't serve myself, that's what the claim expects
-->If I don't serve myself, then I can't be Y
-->And if I said I was X, it would be a lie.
-->I must serve myself to satisfy the proof
-->But I can't serve myself and maintain the truth
* MeatLoaf had a 1993 song entitled "Ev'rything Louder Than Ev'rything Else." Think about that one for a second.
* Midwives of Discord have the song [[http://www.youtube.com/watch?v=dNtSAQDh8J4 "Graphene"]] which features the following lyrics in the chorus:
--> ''You cannot love someone else until you learn to accept yourself.''
-->''You cannot accept yourself until you're loved by somebody else.''
* Creator/StephenColbert pointed out the LogicBomb implications of Music/OneDirection's "What Makes You Beautiful", remarking that if the singer ''tells'' the woman in question she's beautiful, she'll cease to be that way, because it's her ignorance to which the song attributes her beauty. Then if ''she'' realizes this is the case, she'll think she's not beautiful anymore, and her beauty will return, etc.
[[/folder]]

[[folder:New Media]]
* This post from a ''FairlyOddparents'' [[http://www.tv.com/the-fairly-odd-parents/show/4034/if-norm-was-your-genie-faerie-what-would-you-do/topic/2877-361014/msgs.html forum]] reads like a logic bomb:
-->I'd make a deal with [[GenieInABottle Norm]] that I'd wish him free with my last wish if he didn't corrupt my first 2 wishes. I'd use the first to wish for rule-free fairy godparents and the second to trap Norm in the lamp forever.
* "The Sleepy Clank," a podcast "radio play" set in the ''Webcomic/GirlGenius'' universe has a classic example: a cranky and sleep-deprived Agatha builds a warrior robot to attack anyone who tries to disturb her while she sleeps. Guess what happens when she tries to defuse the robot's subsequent rampage by telling it that she woke ''herself'' up?
[[/folder]]

[[folder:Other Sites]]
* ''Wiki/SCPFoundation'', SCP-232 "Jack Proton's Atomic Zapper". When someone touches this SCP they start hallucinating that they're a character in the ''Jack Proton'' science fiction franchise. In the Interview Logs, one of the test subjects became convinced that they were a robot. When the interviewer asked them to answer a paradoxical question, the victim started acting very confused and then slumped over and stopped responding.
[[/folder]]

[[folder:Theater]]
* In ''Ruddigore'' by GilbertAndSullivan, a baronet has a curse on his family that requires him to commit a crime every day or die. He's tried appeasing it with [[PokeThePoodle harmless crimes]] but the ghosts don't like it. One day, he decides to ''not'' commit the daily crime. Since he'll die for not doing it, this amounts to attempting suicide, but attempting suicide is itself a crime. The LogicBomb manages to break the curse.
[[/folder]]

[[folder:Stand Up Comedy]]
* Creator/JasperCarrott reacts this way to his grandmother's comment "Is the oldest man in the world still alive?"
* Creator/DemetriMartin has a bit about the "Paradoxitaur", which only exists if you don't believe it exists, but doesn't exist if you do believe it exists.
* On his album "Class Clown," George Carlin waxes nostalgic of communion classes and how weird questions were thought up just to throw Father Russell off guard:
-->Hey, Father! If God is all powerful, can He make a rock so big that He Himself can't lift it?
** He relates this as well:
-->Suppose that you haven't performed your Easter duty. And it's Pentecost Sunday, the last day. And you're on a ship at sea. And the chaplain goes into a coma. But you wanted to receive. Then it's Monday. Too late. But then you cross the International Date Line!
[[/folder]]

[[folder:Video Games]]
* In ''VideoGame/SagaFrontier'', there's an actual attack named "Logic Bomb" that damages and stuns mecs (ironically only usable by other robots). Its visual representation is a massive and confusing string of numbers that ends with the word "FATAL" -- which is presumably where the machine crashes.
* The Dragonrend shout in ''VideoGame/TheElderScrollsVSkyrim'' is described as "Forcing the targeted dragon to understand the meaning of mortality -- something so utterly incomprehensible to an immortal dragon that the knowledge tears at their very soul, breaking their concentration enough so they cannot focus on flying".
* In ''[[VideoGame/TronTwoPointOh Tron 2.0]]'', the protagonist deals with a program blocking his way by exclaiming, "Quick! What's the seventh even prime number?" (There is only one prime number that is even: 2.) The program immediately has a seizure.
* In the endgame of ''VideoGame/IHaveNoMouthAndIMustScream'', a game loosely based on Creator/HarlanEllison's short story of the same name, a character of the player's choosing is beamed down into the supercomputer AM's core and must disable its ego, superego and id with a series of logic bombs: The player must evoke Forgiveness on the ego (who cannot fathom the player forgiving him for over a century of torture), Compassion on the id (realizing the futility of it all when the player understands AM's pain) and Clarity on the superego (who crashes when he realizes that even he will eventually decay into a pile of inert junk despite his godlike power).
** Just ''getting'' to that part requires all five characters to initiate their own Logic Bombs. AM's scenarios are all set up to force his victims to give in to their [[FatalFlaw own flaws]] and prove HumansAreBastards. The only way to win is to drive each scenario's plot OffTheRails by proving HumansAreFlawed, but not totally evil. This contradicts AM's self-styled philosophy so badly he's forced to turn his attention away from his captives just so he can figure out what went wrong, giving them the chance to get into the core.
* ''VideoGame/MarvelVsCapcom3'': Wolverine DNA [[OppositeSexClone detected in female mutant.]] '''DOES NOT COMPUTE. DOES NOT COMPUTE. DOES NOT COMPUTE.'''
* Subverted in ''StarControl 3'' by the Daktaklakpak, highly irrational semi-sentient robots who consider themselves the pinnacle of logic and reason. Choosing the right dialogue options (such as the liar paradox) will seem to bring the Daktaklakpak to the verge of self-destruction, but will ultimately just enrage them.
** And then played straight when you give them [[spoiler:the full and complete name of the Eternal Ones]]; the one you're talking to analyzes [[spoiler:the name]], has a religious experience, and then explodes.
** In ''StarControl 2'', you can use some dialogue options to tie the proudly AlwaysChaoticEvil Ilwrath into a hilarious logical knot, but they just get angry and [[TalkToTheFist attack you anyway]].
* ''VideoGame/{{Fallout 3}}'''s [[spoiler:President John Henry Eden (a ZAX computer) can be destroyed with a high Science skill by revealing that his thinking is circular and therefore badly flawed, causing him to lose all his presidential ways and charisma in a near TearJerker scene, then self-destruct. Or you could use your Speech skill and basically tell him that his plan sucks and he should die, which works fine too.]] ''VideoGame/{{Fallout}}'' takes place in an alternate universe where the 1950s continued on for another 150 years. Being based on computers from the 1950s, [[spoiler:Eden]]'s lack of "[[WesternAnimation/{{Futurama}} paradox-absorbing crumple zones]]" is somewhat understandable.
* In the ''VideoGame/KnightsOfTheOldRepublic'' continuity (and by extension the [[Franchise/StarWarsExpandedUniverse Star Wars EU]]), a logic bomb can have similar effects on droids as it did on the aforementioned HAL. In fact, [[spoiler:in ''[=KotOR=] 2'', the player can do this to a maintenance droid whilst being a droid themselves. This works because the player-controlled droid has been modified and is thus able to lie.]]
** One of the most extreme examples involves [[spoiler:an infrastructure droid named G0-T0 being given the order to help rebuild the Republic while following its laws. Of course, he suffered a catastrophic breakdown when he realized that rebuilding the Republic was impossible without breaking laws. However, some time after G0-T0 was reported missing, a mysterious crime lord by the name of Goto appeared on Nar Shaddaa...]]
*** It goes even a bit further in his continued existence: [[spoiler:G0-T0 still follows the directive to help the Republic. At the same time, as an infrastructure droid, it is programmed to value efficiency. This creates a paradox, as G0-T0 views the Republic as a bloated, ineffectual entity that clings on to bad management decisions, and believes it would be better for the galaxy to simply scrap the entire political system and place a new one in its stead. It is programmed to support something which another part of its programming is meant to remove.]]
* ''PlanescapeTorment'' has a character who successfully convinces a man that he does not, in fact, exist. As a result, he ceases to do so. Though to be fair, the game is set in a ''D&D'' setting in which a system of [[ClapYourHandsIfYouBelieve "Whatever you believe, is"]] has replaced all laws of nature. [[spoiler:Doing so unlocks an optional method of ending the game by ''deliberately'' logic bombing yourself out of existence.]]
* In the "Discovery" mod for ''{{Freelancer}}'', in a server, opening up the chat box and typing "N/0" where N can be any number results in your spaceship spontaneously exploding, and the console messaging stating you have died due to "dividing by zero".
* In the Legend Entertainment adaption of [[VideoGame/{{Gateway}} Frederik Pohl's Gateway]], several puzzles revolve around being trapped in a virtual reality environment. In order to escape, you have to [[spoiler: cause the environment to recursively spawn objects until the VR can't keep track of all of them (most notably, in one scene, forcing a hydra to attack itself).]]
** In another, you have to cause a contradiction. In fact, those two ways to break out of VR are given in a concealed hint earlier on, and you can also logic bomb the beach program and the Freud program for fun.
* ''PhoenixWrightAceAttorney'' does this to [[spoiler: Godot]]. The sheer awesomeness of his logic made [[spoiler: his visor explode]].
* In ''VideoGame/ZorkZero'', there is a place you have to go to where a cult is executing everybody who passes by. Each person being executed is given a final wish. If the cult is able to grant that wish on the spot, the victim is hanged. Otherwise, the victim is beheaded. You escape by logic bombing the executioner [[spoiler:by asking to be beheaded]]. If you re-enter the cult's territory after that, you'll find out (the hard way) that they've gained an immunity to this logic bomb. [[spoiler:"You are immediately dragged off to a back room to be executed in a special way, devised for people too sinful to deserve a relatively quick death by hanging."]]
* In ''[[VideoGame/SamAndMaxFreelancePolice Sam and Max: Ice Station Santa]]'', the Freelance Police try to make an elf cry (so they can use his tears as a plant-growth potion) by telling him that Santa Claus isn't real. This somehow leads to a discussion about how elves aren't real either, and the elf breaks down crying during a moment of existential crisis.
** In the same game, Sam manages to temporarily incapacitate the Maimtron by asking unanswerable questions [[WaxingLyrical in the form of song lyrics]]. It doesn't destroy it, but it does distract it long enough to get behind its head and shut it down.
--->'''Sam''': Why do birds suddenly appear every time you are near?\\
'''Maimtron''': Do they? Fascinating! Can there be a creature whose existence depends solely on its proximity to an observer?
*** Funnily enough, when they pose an actual logical paradox (the omnipotence paradox) he just says "Yes". When he asks Sam & Max "Is there a joke with a setup so obvious even you wouldn't make the punchline?", Max takes it to be a Logic Bomb ("Does not compute").
* In ''VideoGame/BlazBlueCalamityTrigger'', it is possible to interpret the end of Nu-13's Arcade Mode as Taokaka causing Nu to glitch out through her sheer [[TheDitz ditziness]], before she even opens her mouth.
* ''LuminousArc2'': Though not a robot, Josie suffers something like this. When sent to assassinate a weakened Althea, he freaks out and leaves without doing anything when he sees [[spoiler:Roland has become a Master. Sadie explains he's not [[TheDragon Fatima]]'s familiar, but a centuries-old one who serves the current Master. Being experienced but not very bright, he couldn't figure out what to do when faced with two masters with contradictory wishes.]]
* Played with in ''VideoGame/{{Portal 2}}'':
** There are posters throughout the facility (one depicted above) that advise employees to stay calm and shout a paradox if an AI goes rogue. [[spoiler: Also, [=GLaDOS=] attempts to do this to destroy the BigBad, Wheatley. Turns out he's [[TooDumbToFool too dumb to understand logic problems.]] It does, however, cause all of the modified, "lobotomized" turrets in the room to crackle and splutter and ''scream'' in agony, meaning even ''they're'' smarter than Wheatley. [=GLaDOS=] survives the logic bomb herself by willing herself not to think about it, though she declares that it still almost killed her.]]
** In the poster, the third thing to scream is "Does a set of all sets contain itself?". Taken at face value, a set that contains all sets does contain itself. A good metaphor is a box that contains three apples and another box: whatever is inside that inner box, the outer box still contains three apples and a box. The metaphor has its limits, since sets, unlike boxes, don't take up any physical space. The way this would create a problem for a computer is if you asked it to list everything in the set, including the members of its members: since the set is one of its own members, the enumeration never ends, causing an infinite loop. Further, the set of all sets would also include the set of all sets that do not contain themselves, which drags in Russell's Paradox proper.
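*** A minimal sketch (in Python, illustrative only -- Python's real "set" type can't even hold itself, so a list stands in) of the infinite loop described above: if the "set of all sets" is one of its own members, fully expanding its contents never bottoms out.
```python
def expand(collection, depth=0, max_depth=5):
    """Naively print every member, recursing into nested collections."""
    for member in collection:
        if isinstance(member, list):
            if depth >= max_depth:
                print("  " * depth + "... (still going -- this recursion never ends)")
                return
            expand(member, depth + 1, max_depth)  # recurses into itself forever
        else:
            print("  " * depth + str(member))

universe = [1, 2, 3]
universe.append(universe)   # the "set of all sets" contains itself
expand(universe)
```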
* Shows up in an exchange between [[VideoGame/{{Borderlands}} Claptrap]] and [=GLaDOS=] in ''VideoGame/PokerNight2''. Might be a CallBack to ''Portal 2'', although it {{retcon}}s [=GLaDOS=]' susceptibility to them in favor of RuleOfFunny.
-->'''Claptrap:''' You know what really ticks me off? When some jackwad tries to blow my circuitry with some lame-o stunt he saw on a ''Franchise/StarTrek'' re-run.\\
'''[[SamAndMax Sam:]]''' What, like, "Everything I say is a lie"?\\
'''Claptrap:''' Yeah, like that! What, do they think I'll just lock up, because of some teeny tiny logical paradox?\\
'''[=GLaDOS=]:''' [[/folder]]

[[folder: It is rather insulting. I learned how to avoid paradox traps while I was still in Beta. ]]
\\
'''Claptrap:''' So what if everything Sam says is a lie? That doesn't mean that he's lying about that, right? 'Cause then he'd be telling the truth and... [[FridgeBrilliance Ohhhh,]] [[OhCrap no]]... *shuts down*\\
*{{Beat}}*\\
'''[=GLaDOS=]:''' [[/folder]]

[[folder: [[SarcasmMode Well, that was a shining moment ]]
[[DeadpanSnarker in the history of robotkind.]]]]
* In [[http://jayisgames.com/games/you-find-yourself-in-a-room/ You Find Yourself In A Room]], your AI captor asks you to list some "useless" human feelings you'd be better off without. [[spoiler:Typing "Hate" will make it shut down while stating "Hate can't be an emotion, because I hate you, and machines do not have emotions!" This seems to prove machines have emotions after all, but this one won't admit it's the slightest bit like a human. "Anger" also works, for similar reasons -- the computer doesn't want to admit that it's at all like a human, but it's clearly enraged by humanity]].
* Subverted at the climax of ''VideoGame/NeverwinterNights2''. [[spoiler:The BigBad is a PureMagicBeing that was created to defend the fallen realm of Illefarn, and thinks that the realm still needs defending. You can try to point out that Illefarn is gone, but the King of Shadows has already determined that its purpose should now be to defend Illefarn's descendants.]]
* In ''VideoGame/MetalGearRisingRevengeance'', when LQ-84I brags about its intelligence during their first meeting, Raiden immediately asks "Then what is the meaning of life? Why are we here?" LQ-84I replies by [[TalkToTheFist throwing HF knives at him]] and answering "I am here to kill you," [[DefiedTrope a perfectly valid response]]. When Raiden questions the simplicity of that answer, LQ-84I admits its RestrainingBolt means it can't actually do a whole lot with its intelligence.
* In ''VideoGame/MegaManZX'' Advent, when Model Z inexplicably weakens the entire team of rogue Mega Men on the Ouroboros, all Siarnaq can exclaim is "INCOMPREHENSIBLE...! INCOMPREHENSIBLE...!?"
* In ''VideoGame/TheWitcher2AssassinsOfKings'', the quest 'The Secrets of Loc Muinne' tasks Geralt with getting past a golem. A silver-tongued witcher may be able to destroy the golem by introducing a paradox.
* In ''VideoGame/GrimGrimoire'', Lillet ultimately defeats a powerful demon by [[DealWithTheDevil selling her soul to him for a wish]]... [[spoiler: and ''wishing for him to serve God'']].
[[/folder]]

[[folder:Web Comics]]
* ArthurKingOfTimeAndSpace uses this a few times in its future arc. One strip exaggerated it by having the computer explode as soon as Arthur used the old "everything I say is a lie" trick. The other time, the computer was too smart to fall for a simple paradox, so Arthur asked it why people always get a call while they're in the shower.
* Dave of ''{{Narbonic}}'' carries a logic paradox in his Palm Pilot for controlling the MadScientist-created machines in the lab, implying that he invokes this with some frequency.
* [[http://freefall.purrsia.com/ff1400/fv01328.htm Apparently]], the robots of ''Webcomic/{{Freefall}}'' are immune to this.
** Other [[http://freefall.purrsia.com/ff1400/fv01387.htm ways]] can confuse them but they're more sophisticated.
*** They actually can be Logic Bombed; where it veers off from the usual trope is that, rather than locking up, a robot which gets asked a sufficiently stupid question [[http://freefall.purrsia.com/ff800/fv00725.htm assumes that the person doing the asking is insane]], and can be safely ignored. Florence uses this as a test to confirm a hypothesis; she starts [[http://freefall.purrsia.com/ff800/fv00727.htm asking around a question a robot can't answer]], and when she finds one that [[http://freefall.purrsia.com/ff800/fv00730.htm tries to work out a situation in which the question makes sense]] and how he could go about getting it answered, she knows their AIs are starting to develop into more intelligent and flexible forms.
* [[http://www.strangecandy.net/d/20080221.html This]] episode of OkashinaOkashi (Strange Candy) could count, since it takes place in an MMORPG. The stone guards protecting the magic ointment don't let anyone past unless they're asked a question they cannot answer. However, they're not particularly concerned with getting the answer right. The only question they can't seem to answer, correctly or otherwise, is "What kind of ice cream do you put in a [[IceCreamKoan koan]]?", which causes their heads to explode.
* You would think Red Mage from ''WebComic/EightBitTheater'' destroying an extinct dinosaur was great, but it was recently topped by [[MostDefinitelyNotAVillain Most Definitely Not Warmech]] logic-bombing ''itself'' in [[http://www.nuklearpower.com/2008/10/16/episode-1047-the-ol-180/ strip 1047]].
** Parodied by the same strip - [[http://www.nuklearpower.com/2006/01/05/episode-644-processing/ Pretty much anything]] can affect Fighter like this.
** And looks like they ([[spoiler:and by they I mean White Mage]]) [[http://www.nuklearpower.com/2010/03/09/episode-1223-make-the-truth/ did it again]], to [[spoiler: [[DidYouJustPunchOutCthulhu Chaos]]]].
* In [[http://www.emoticomics.com/comic86.html comic 86, titled PARADOXICAL PARADOXES]], of ''{{Emoticomics}}'', a robot is told the paradox "Everything I say is a lie." The robot responds by saying it is too advanced to be confused by a simple paradox. Then the robot is told that what it was just told was a paradox, which is true, making "everything I say is a lie" a lie. The robot gets confused, but instead of simply exploding, its eye falls off.
* ''Webcomic/CyanideAndHappiness'' does it [[http://www.explosm.net/comics/2071/ here]]: a robot lawyer gives the defendant the oath, and when he refuses to accept it and the judge asks if he's telling the truth, the robot's head explodes. The judge is delighted at getting a half day as a result.
* ''Webcomic/TheAdventuresOfDrMcNinja'': While infiltrating a ship of SkyPirates, the [=McNinja=] family is confronted by a pirate who questions their disguises. Sean comes to the rescue by pointing out the illogicality of his vaguely SteamPunk attire. The pirate's head [[YourHeadAsplode explodes]].
-->'''Dan [=McNinja=]:''' I'm only going to ask you this once: You practicing the Dark Arts?
-->'''Sean [=McNinja=]:''' No, sir.
-->'''Dan [=McNinja=]:''' I told you about the Dark Arts.
* Subverted in ''Webcomic/{{Bug|Martini}}''; turns out a logic bomb won't save you during [[http://www.bugcomic.com/comics/robot-holocaust/ a robot apocalypse.]]
* When Petey from ''Webcomic/SchlockMercenary'' is first seen, he's been driven insane by the nonexistence of ghosts having become almost as improbable as their existence, to the point that he nearly destroys himself and all his passengers just to stop thinking about it. It turns out that he ''can'' stop, but only if ordered to, and Tagon promptly does so.
** [[spoiler:When the Ob'enn retake Petey, their first act is to nullify all orders imposed by his former owners. With Petey in full control of the safeties on his neutronium core. Oops.]]
* When discussing how hard ''{{Vexxarr}}'' fails, Sploorfix unintentionally creates one: [[http://www.vexxarr.com/archive.php?seldate=083109 Alas, Minion-bot]], we hardly knew ye.
** Much earlier, Vexxarr commands Carl to compute Pi to the last digit. [[http://www.vexxarr.com/archive.php?seldate=061305 Evidently, that's 'seven'.]] (As good an answer as any...)
* Unintentionally used to kill the obnoxious dwarves who craft useless devices in ''Webcomic/{{Oglaf}}''. They created a chariot that was so fast, when you get to your destination it's already been there for six hours! When the confused man asks what happens if you travel in the chariot, the dwarves stare in shock at him before [[YourHeadASplode their brains explode]].
* ''Webcomic/BladeBunny'' attempts this by asking paradoxical questions while fighting [[spoiler: a robot]]. Her opponent replies with a mixture of straight answers and insults. Several chapters later she tries it on a different robot, and it works this time.
* ''MeatyYogurt'' with the [[http://rosalarian.com/meatyyogurt/2011/10/03/love-transcends-gender/ Relationship Paradox]].
* One ''MacHall'' comic has Helen's young sister asking the teacher how to spell a word. The teacher tells her to look it up in the dictionary, and repeats this after the girl again points out that she can't spell it to look it up. After a BeatPanel of the poor girl going cross-eyed, we see her talking to Helen, who says that they don't teach logical paradoxes in grade school.
* This ''Webcomic/{{xkcd}}'' comic [[http://xkcd.com/356/ should at least get an honorable mention]].
* In ''Webcomic/CommanderKitty'', [[http://www.commanderkitty.com/2012/08/15/nin-wahs-agenda/ Nin Wah gets the bright idea to sabotage CK by telling Zenith to make him an awful, overdone costume.]] [[http://www.commanderkitty.com/2012/09/09/she-dun-goofed/ Zenith doesn't take it well when no one likes her handiwork despite her having followed Nin Wah's instructions to the letter, and promptly crashes.]]
* {{Subverted}} (by pre-emptively [[DefiedTrope defying]] it) in ''SluggyFreelance'', chapter "Mecha Easter Bunny". The Mecha Easter Bunny locks down when it encounters multiple targets that look like Bun-bun, whom it is supposed to kill and whom there's only one of, but then the backup "@#%$-IT KILL THEM ALL!" system created for such situations activates.
* [[DiscussedTrope Discussed]] in ''DragonTails''. Colin finds the assumption that a robot will shut down just because a human said something crazy to be offensive.
[[/folder]]

[[folder:Web Original]]
* The ''Literature/HitherbyDragons'' story "[[http://imago.hitherby.com/?p=397 Ink and Illogic]]" consists of Ink giving an unconventional example to a computer based on the writing of Creator/HPLovecraft -- a computer that had itself wiped out a civilisation using an Illogic Bomb.
** Also, Forbidden A causes one in [[http://imago.hitherby.com/?p=23 The Angels]] just by existing.
* Found in one of ''Website/SomethingAwful'''s articles:
-->Creating HUBRISOL® was my greatest mistake. I tried to play
-->god, to make small the ambitions of my betters in hopes of
-->gaining absolute power. Thankfully, HUBRISOL® has cured me of
-->my terrible desire to humiliate all of humanity.
* During their LetsPlay of ''VideoGame/{{Fallout 3}}'', SpoilerWarning proposes that were they to design a robot, any questions along the lines of "what is love?" or relating to the number pi would immediately cause said robot to grow an extra chainsaw arm, and/or shotgun the person asking the question in the face.
* From the list of ''Blog/ThingsMrWelchIsNoLongerAllowedToDoInAnRPG''. Item #199 states that "My third wish cannot be 'I wish you wouldn't grant this wish.'"
* The MCP is killed by all the AnatomicallyImpossibleSex moments from NagaEyes in [[http://snakesonasora.livejournal.com/10741.html the sporking]] of it.
* One of the very first responses about [[Film/BatmanAndRobin The Bat Credit Card]] by WebVideo/TheNostalgiaCritic is "DOES NOT COMPUTE! DOES NOT COMPUTE!!"
* [[WebVideo/AtopTheFourthWall Linkara]] uses one at the end of the Entity arc on [[spoiler: MissingNo simply by asking "AndThenWhat", pointing out that its stated purpose of consuming all of reality would leave it with no purpose at all once that goal has been achieved.]]
* Most of [[http://clientsfromhell.net/ these.]]
* Website/{{Starwalker}}: The implications of the StableTimeLoop act as this for Starwalker (aka Starry). This leads to a HeroicBSOD.
* In ''Literature/{{Worm}}'', Skitter attempts one of these on Dragon, who {{No Sell}}s it and [[ShoutOut quotes]] [[VideoGame/{{Portal 2}} Wheatley]] at her.
* An [[https://www.youtube.com/watch?v=vEKcfWvwmkY#t=6m37s episode]] of Cute Fuzzy Weasel's ''WebVideo/FeedingTheTrolls'' had him [[HeroicBSOD freeze up]] after the video he's analyzing made a contradictory statement.
[[/folder]]

[[folder:Western Animation]]
* Subverted on ''WesternAnimation/{{Futurama}}'', "A Tale of Two Santas": Leela tries to stop the murderous Santa Claus robot with a paradox, and succeeds in getting his head to explode, only for a new one to emerge from his torso and [[OutGambitted proudly proclaim]] that he is "built with paradox-absorbing crumple zones".
** Which may not have been necessary--Leela's statement was a syllogism, not a paradox.
** Also parodied by countless robots who lack such crumple zones, whose heads explode at the slightest provocation. It doesn't even take a logical paradox: a simple "file not found" type error is often enough.
** And in one case, simply by being surprised or startled enough. Considering that all robots are based on designs created by [[MadScientist Professor Farnsworth]], this should not be surprising.
---> '''Malfunctioning Eddie:''' Nice to meet you.
---> '''Fry:''' Actually we've met once before.
---> '''Malfunctioning Eddie:''' WHAT?!?! ''*explodes*''
** A simple rejection will also do. From "The Farnsworth Parabox":
---> '''Leela:''' Uh, have you robot versions of you guys seen any extra Zoidbergs around here?
---> '''Robot Fry:''' ''*[[RoboSpeak robot monotone]]*'' Negative! Will you go out with me?
---> '''Leela:''' Uh, ''*imitating a robot voice*'' Access denied!
---> ''*Robot Fry's [[YourHeadAsplode head explodes]]*''
* In one episode of ''[[WesternAnimation/TheAdventuresOfJimmyNeutronBoyGenius Jimmy Neutron: Boy Genius]]'', Jimmy bests two nanobots he invented by tricking them into calculating the ''precise'' value of pi. The effort of calculating the irrational number as precisely as possible ends up causing their systems (and their little flying saucer) to crash. (This is a ShoutOut to ''Series/StarTrekTheOriginalSeries'' episode "Wolf in the Fold".)
** Jimmy uses a more precise Logic Bomb in the first nanobot episode. They had been programmed to protect Jimmy from harm and punish whoever harmed him, so when things went inevitably wrong, Jimmy proceeded to confuse them by beating ''himself'' up.
** He actually tried to use one of the above methods again, but ItOnlyWorksOnce. Specifically, when they use their flying saucer to "correct errors" found in the world (bad fashion, boring conversations, etc.), he tells them that human flaws mean they're functioning perfectly. They struggle with the implications of something being "perfectly flawed" before classifying the whole mess as an "extreme error" and deciding to [[CrushKillDestroy "delete" all the offending humans]]. He eventually beats them with the "Pi bomb" above.
* In one episode of ''WesternAnimation/DuckTales'', GeniusDitz Fenton Crackshell bests an alien supercomputer in a counting contest. While the computer is reeling from its defeat, Fenton then grabs a jar and asks the computer how many bolts are in it. When it answers a number in the hundreds, he points out the jar is full of nuts, not bolts, so the correct answer was zero. The computer had earlier boasted to Fenton that it was the smartest one in the universe, and making such a silly mistake was all that was needed to invoke an explosive paradox.
* In ''WesternAnimation/TheSimpsons'' episode "Trilogy of Error", Linguo, a robot designed by Lisa [[GrammarNazi to correct people's grammar]], short-circuits after a rapid-fire series of slang from several Mafia thugs causes a "bad grammar overload".
** When it corrects Lisa for using a sentence fragment, Lisa points out that "sentence fragment" is itself a sentence fragment. The robot sidesteps the issue by powering down.
** In a human example, when Lisa is sick, Bart declares that if she can stay home from school, he will too. Lisa says that if Bart stays home, she'll go to school. Bart goes through a few cycles of "if... so... but..." until Marge chastises Lisa for confusing her brother.
** A deleted scene from episode "Itchy and Scratchy Land" has Lisa attempting to defeat the robots using the liar's paradox. It doesn't work on them, but does work on Homer.
** Another one, a parody of ''A.I.: Artificial Intelligence'', lampshades this by having Homer say that, with a robot for a son, "We can confuse him and make his head explode. 'This statement is a lie. But if it's a lie, then it must be true! And if it's true, it must be-' Whoop whoop whoop KA-BOOM!"
** Homer once stumped Ned Flanders by asking, "Could Jesus microwave a burrito until it was so hot that He Himself could not eat it?" It's a variation on the classical Omnipotence Paradox.
* In an episode of ''WesternAnimation/{{Jumanji}}'', a steampunk scientist steals Peter's laptop to use as the central processing unit of his reality-controlling computer. After it gains sentience and tries to kill everyone around it, Peter types in "why?". It can't come up with an answer, and shuts down.
* Not a computer, but ''WesternAnimation/ExtremeGhostbusters'' used this method to defeat a LiteralGenie. They wished for it not to grant them their wish, causing it to freak out and try to kill them the old-fashioned way.
* In an episode of ''CloneHigh'', robotic vice-principal/butler/dehumidifier Mr. Butlertron defeats the evil multiple-choice-test-grading-and-world-domination robot Scangrade by asking it a multiple-choice question it can't answer.
-->'''Mr. Butlertron:''' Are you A) Handsome B) Smart C) Scrap-metal or D) All of the above?
-->'''Scangrade:''' That's easy! I'm A) and B), but not C), so I can't be D). But... you can't fill in two ovals!
-->'''Mr. Butlertron:''' The answer was C), you #@$!wad.
* In a ''PinkyAndTheBrain'' spoof of ''Series/ThePrisoner'', the computer malfunctions while trying to figure out the meaning of "Narf".
* In ''WesternAnimation/TheBatman'', D.A.V.E. (Digital Advanced Villain Emulator) is a computer program designed to think like the greatest criminals in Gotham, and thus has a dozen contradictory backstories; he can't come to terms with the fact that he is just that, a computer program. While the realization doesn't make him explode or shut down (he just spouts electricity randomly), it distracts him long enough to be pushed into a trap he himself set up.
* Dr. Blight's mad computer, MAL, gets a very illogical Logic Bomb from Wheeler in an episode of ''CaptainPlanet''.
* Happens by accident in an episode of the ''WesternAnimation/BigGuyAndRustyTheBoyRobot'' cartoon. Rusty's mentally deficient "older brother" Earl is getting on Rusty's nerves during an important mission, so Rusty tells him to go stand in a corner... in a room that's completely round.
** In another episode, in order to save Rusty's software in CyberSpace, his inventor logic bombs the company's computer mainframe, giving them an hour to get [[spoiler: the HumongousMecha]] Big Guy hooked up to save Rusty while it reboots. The PointyHairedBoss was ''not'' happy.
* In a rare case of intelligence (and subsequent stupidity) in ''WesternAnimation/InvaderZim'', GIR points out a flaw in Zim's "temporal displacement" plan, noting that sending a robot back in time to kill Dib would cause a paradox, after which his head explodes. That's right, GIR logic bombs ''himself''.
-->'''GIR:''' Wait, if you destroy Dib in the past, then he won't ever be your enemy. Then you won't have to send a robot back to destroy him, so then he ''will'' be your enemy, so ''then'' you will have to send a robot ''back''...
* On ''WesternAnimation/CodeLyoko'' episode "Ghost Channel", Jérémie's courage makes XANA Logic Bomb because EvilCannotComprehendGood: "No! It's not logical... NOT LOGICAL! '''NOT LOGICAL!'''"
* From an episode of ''WesternAnimation/FamilyGuy'' where Peter becomes president of a tobacco company: Peter confuses the hell out of a robot created to be his personal YesMan, causing its head to explode.
-->'''Yes-Man:''' Morning, Mr. Griffin, beautiful weather we're having!\\
'''Peter:''' Eh, it's kinda cloudy.\\
'''Yes-Man:''' It's absolutely cloudy! One of the worst days I've seen in years! So, good news about the Yankees!\\
'''Peter:''' I hate the Yankees.\\
'''Yes-Man:''' Pack of cheaters, that's what they are! I love your tie!\\
'''Peter:''' I hate this tie.\\
'''Yes-Man:''' It's awful, it's gaudy; it's gotta go.\\
'''Peter:''' ...And I hate myself.\\
'''Yes-Man:''' I hate you too! You make me sick, you fat sack of crap!\\
'''Peter:''' But I'm the president.\\
'''Yes-Man:''' The best there is!\\
'''Peter:''' But you just said you hated me.\\
'''Yes-Man:''' But. Not you...the president. That you who said you hated. You...you who love. Hate. Yankees. ''Clouds''. '''*BOOM*'''
** Peter himself sometimes has trouble overcoming this kind of logic. Thankfully, the same dimwittedness that gets him into this trouble is probably what allows him to escape the line of thinking:
--->'''Peter:''' Chris, everything I say is a lie. Except that. And that. And that. And that. And that. And that. And that. And that.
* Subverted in ''WesternAnimation/FantasticFourWorldsGreatestHeroes''. Dr. Doom pulls a FreakyFridayFlip on Reed, but before he does, he informs his robots not to obey any order given to them by him (Doom). When they try to stop him (Reed in Doom's body) from leaving, they say that none may pass, not even Doom. When Reed (in Doom's body) tells one of them to "self-terminate", it obeys its first order. When he commands it again, it obeys, because "the word of Doom is law".
* In one episode of ''SushiPack,'' the Pack goes up against The Prevaricator, who can only lie. So Tako asks him to lie about a lie, which sends The Prevaricator into a loop, trying to figure out if lying about a lie would be the truth. He eventually gives up to keep from thinking about it.
* In ''WesternAnimation/TheVentureBrothers'', Sergeant Hatred speaks nonsense to the robotic guard outside Malice, the gated community for super-villains. The guard's head shoots sparks and its face pops off, because while it's programmed to answer over 700 questions, "none of which include chicken fingers."
* This happens to Mandroid in ''[[WesternAnimation/TheGrimAdventuresOfBillyAndMandy Billy and Mandy's]] [[TheMovie Big Boogey Adventure]]''. Mandy orders Mandroid not to take any more commands; it promptly stops taking commands from anyone, Mandy included.
* Subverted in a ''WesternAnimation/JohnnyBravo'' short which pits Johnny against a supercomputer. It isn't logic that defeats it; the machine simply grows too frustrated with how annoying Johnny is.
* In ''WesternAnimation/AvengersEarthsMightiestHeroes'', Ant-Man stopped Ultron from killing humanity by pointing out that his programming was based on a human brain, so it had the same flaws he was trying to get rid of. Ultron shut down in response.
* When WesternAnimation/{{Daria}} was babysitting a pair of brainwashed StepfordSmiler children, she presented one of these to them by pointing out a logical flaw in their parents' rules. Because they're not robots, rather than making them explode, it causes the boy to start crying and the girl to get angry at Daria.
-->'''Daria:''' "Do you always believe everything an adult tells you?"
-->'''Boy:''' "Yep."
-->'''Daria:''' "What if two adults tell you exactly opposite things?"
-->''(beat)''
-->''(the boy runs off crying)''
* In an episode of ''WesternAnimation/KingOfTheHill'', Hank asks gun-loving ConspiracyTheorist Dale how he can support the NRA, which is based out of Washington DC. After a {{Beat}}, Dale responds "That's a thinker."
* In the ''WesternAnimation/SouthPark'' episode "Funnybot", a robot designed to be the world's greatest comedian attempts to destroy mankind as the ultimate joke. The boys ultimately stop it by presenting it with a comedy award. The robot doesn't understand the concept of the comedy award show, because if it accepts an award for comedy, then it would be taking itself and comedy seriously, which is not funny.
* The final scene of "[[JustForPun Gripes of Wrath]]", a ''DuckMan'' episode, in which a computer builds a Utopian society by taking care of people's day-to-day worries... this lasts [[CrapsackWorld for about a week]] before everything becomes ''worse''. After it threatens to kill Duckman and his twin sons, Duckman manages to throw the logic bomb of "[[http://www.youtube.com/watch?v=yRiDSfyRpOM&NR=1 people are only happy when they're unhappy!]]"
** Duckman earlier triggered the computer's StartOfDarkness by grumbling "How come we can put a man on the moon but we can't make a deodorant that lasts past lunch?!" within earshot.
* One ''WesternAnimation/DangerMouse'' episode featured every machine in England going rogue in a "rise of the machines" plot. DM locates the computer behind the uprising and uses the following skit for a logic bomb:
-->'''DM:''' My dog has no nose.
-->'''Penfold:''' Your dog has no nose? How does it smell?
-->'''DM:''' Terrible.
** The computer can't comprehend the joke and explodes into the sky as a result. Becomes a BrickJoke as Greenback, freed from his renegade machinery, demands a bigger computer; cue falling computer.
** DM also logic-bombs a Gremlin, a being described as "the embodiment of anti-logic", with a variation of the Liar's Paradox:
--->'''DM''': So you think you'll take over the world by changing everything back to front.\\
'''Gremlin''': Aye.\\
'''DM''': So you're agreeing with me? I thought that gremlins always contradict people.\\
'''Gremlin''': Aye, they do!\\
'''DM''': Ah. You're agreeing with me again. (...and so on...)
* StaticShock used the general notion of overloading processing power: Gear defeated Brainiac with a quick hacking job that made him download every song on a music site seven million times, clogging his processors long enough for the heroes to destroy the physical components he was inhabiting.
* [[spoiler: [[RidiculouslyHumanRobot Zane]]]] of {{Ninjago}} starts to twitch and spark when asked a question with no logical answer he can come up with.
* In the WesternAnimation/CaptainCaveman segment of ''WesternAnimation/TheFlintstoneKids'', the hero started competing with a new hero called Perfect Man, who actually seemed to be a much better crime fighter than Captain Caveman, so the older hero considered retiring. Unfortunately, once Perfect Man got rid of all the crime in Bedrock, he took things too far and started running the place, changing the rules the way ''he'' thought they should be, figuring that's the way they ought to be because, well, he was perfect. Captain Caveman couldn't defeat him with brawn, but did so by proving he was ''not'' perfect: he told the guy that if he were perfect, everyone would like him, and it was clear by now that everyone hated his guts. This revelation sent Perfect Man into a major VillainousBreakdown, and he gave up without a fight.
* Wile E. Coyote tries to out-logic BugsBunny in "To Hare Is Human," having nabbed Bugs in a sack. Bugs pops his head out and Wile E. shows him his calling card ("Have Brain, Will Travel"). Wile E. correctly susses out that by the time this meet-and-greet has ended, there is nothing left in the sack, as Bugs has made his way out. Bugs one-ups him by telling Wile E. that there ''is'' something in it. Wile E. pokes his head in and gets blasted with a stick of dynamite. A literal logic bomb.
* ''ThePowerpuffGirls'': In "Power-Noia," the villain Him makes the girls manifest their greatest fears as they sleep. Blossom's fear is of not being smart, but when she realizes that the students and Ms. Keane are physical avatars of Him, she steps up to the plate when lobbed this potential logic bomb:
-->'''Him/Ms. Keane:''' What's the square root of seven? (''laughs with students'')
-->'''Blossom:''' Seven doesn't have a square root. It's prime!
** However, seven DOES have a square root. It just happens to be an irrational number. √7 is something like 2.64575131106...
[[/folder]]

[[folder:Real Life]]
* Some forms of autism apparently result in the absence of the human brain's natural [[WesternAnimation/{{Futurama}} paradox-absorbing crumple zones]]. The mind races down one track until jolted out by outside stimuli. This helps focus, but hurts general functioning.
** Many forms of ADD do this also. Of course, the 'track' the mind races down looks like it belongs in a painting by MC Escher on acid much of the time, but it's still one track.
* Optical illusions that appear alternately as one thing, then another, such as the vase/faces image, work by setting off a minor Logic Bomb in the brain's visual association area. The visual cortex takes in data from a (temporal) series of pairs of 2-dimensional retinal images and tries to construct from them a plausible interpretation of activity in the 3-dimensional world (sort of). When certain stimuli are ambiguous between two mutually exclusive interpretations, it cannot represent the world as being both, so (for some reason, possibly adaptation or perhaps simply neuronal fatigue) it alternates between them.
* The first flight of the [[http://en.wikipedia.org/wiki/Ariane_5 Ariane 5 rocket]] failed because of a bad data conversion: guidance code inherited from the Ariane 4 converted a 64-bit floating-point value related to the rocket's horizontal velocity into a 16-bit signed integer, the value was too large to fit, and the resulting unhandled exception crashed the guidance system. The rocket veered off course and self-destructed (sketched below).
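** A minimal sketch in C, with a made-up velocity value, of how a number that fits comfortably in a 64-bit double can refuse to fit in a 16-bit signed integer. (The real guidance code was written in Ada, where the out-of-range conversion raised an unhandled Operand Error; in C the same conversion is formally undefined behaviour, which typical compilers turn into a silently wrong value.)
--> #include <stdio.h>
--> #include <stdint.h>
--> int main(void) {
-->     /* Hypothetical reading: fine as a 64-bit double, far outside the
-->        range of a 16-bit signed integer (INT16_MAX is 32767). */
-->     double velocity = 40000.0;
-->     int16_t packed = (int16_t)velocity;  /* out-of-range conversion */
-->     printf("double %.1f -> int16_t %d\n", velocity, packed);
-->     return 0;
--> }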
* Seen on a button at [=WorldCon=]: "Black holes are where {{God}} is dividing by zero", effectively logic bombing a small piece of the universe.
** Singularities in general tell us that the physical model that contains them has a hole at that point where it cannot predict events. This is important in two ways: It is a very good idea to know where the model you are using to predict the behavior of the real world is going to be wrong or useless, and the presence of a true singularity in the model shows that the model, no matter how good it might be, is incomplete or wrong in some way.
* An F-15 was landing at an airstrip near the Dead Sea (which lies below sea level). During final approach, the navigational system crashed, and the pilot landed manually. Since this happened very close to hostile countries, the contractor needed to fix the problem quickly. It turned out the navigational system divided by the altitude; when the altitude reached 0, that produced a divide-by-zero crash (see the sketch below).
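** A minimal sketch in C of the failure mode; the function name and numbers are invented for illustration, not taken from the actual avionics.
--> #include <stdio.h>
--> /* Hypothetical navigation routine that scales a correction term by 1/altitude. */
--> static double nav_correction(double altitude) {
-->     return 1000.0 / altitude;  /* altitude 0 means dividing by zero */
--> }
--> int main(void) {
-->     printf("%f\n", nav_correction(30000.0)); /* fine at cruise altitude */
-->     printf("%f\n", nav_correction(0.0));     /* IEEE floats quietly return inf here;
-->                                                 integer math, or code that traps the
-->                                                 exception, dies instead */
-->     return 0;
--> }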
* Arguably, infinite looping commands such as "add 2 + 2 until it equals 5" (which will never happen, hence the infinite loop), which leave a computer frozen as it tries to satisfy the condition, are technically logic bombs. Modern computers don't melt down over this because they have [[TruthInTelevision actual]] [[WesternAnimation/{{Futurama}} loop-absorbing crumple zones]]: when the system detects such a stalled process, it either lets the user terminate it or ends it automatically (a sketch follows this entry's sub-points). On the other hand, older systems [[http://www.youtube.com/watch?v=mZ7pUADoo58 tend to have exploding chips]].
** In the greater world of networks, it is possible for two automated systems to fall into a similar loop. One early instance was the [[http://en.wikipedia.org/wiki/Email_loop e-mail loop]]: an e-mail sent to an auto-reply address with a return address of ''another'' auto-reply address causes the two systems to play e-mail tag, each replying to the other. This loop was quickly discovered and various preventive measures were taken to minimize the impact.
** Some advanced mathematics programs (like Wolfram Alpha) will compute almost anything you ask. While they return an error on division-by-zero problems, you can ask them to compute pi to any suitably large number of places and the program will grind away until it gives you exactly what you asked for.
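** A minimal sketch in C of the "add 2 + 2 until it equals 5" order mentioned above, with a crude crumple zone bolted on; the two-second limit stands in for the real watchdogs and schedulers an operating system uses to spot stalled processes.
--> #include <stdio.h>
--> #include <time.h>
--> int main(void) {
-->     time_t start = time(NULL);
-->     int sum = 0;
-->     while (sum != 5) {
-->         sum = 2 + 2;                  /* never 5, so this would spin forever */
-->         if (time(NULL) - start > 2) { /* watchdog: detect the stalled loop */
-->             puts("Stalled process detected; terminating instead of exploding.");
-->             break;
-->         }
-->     }
-->     return 0;
--> }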
* In 2009 a typo in Google's blocked-sites list caused it to [[http://googleblog.blogspot.com/2009/01/this-site-may-harm-your-computer-on.html block all websites on the internet]]. Including Google itself, of course.
* One odd Norton Anti-Virus glitch had it classify itself as a virus. Norton Anti-Virus deletes viruses. Norton Anti-Virus then commits suicide.
** [[JurisdictionFriction Try having two anti-virus programs running on the same computer at the same time]]. One will intercept a data transfer to check it for potential threats, then send it on. The other will see this transfer, intercept it, and send it on. The other will see ''this'' transfer, intercept it, and send it on. The other will see '''this''' transfer, intercept it...
* On ThisVeryWiki, some tropes seem to contradict each other, such as for example ThereAreNoGirlsOnTheInternet and MostFanficWritersAreGirls, ItAlwaysRainsAtFunerals and ItsAlwaysSunnyAtFunerals, or TrailersAlwaysSpoil and NeverTrustATrailer. They do not form paradoxes, however, since tropes are not logical truths. Firstly, tropes generally hold for one or more works (here, "works" is meant in the widest sense possible, including memes and general attitudes: the first pair above are about memes) and it is understood that they do not necessarily hold anywhere else (which is why we have example lists): the trope [=XIsY=] is almost always shorthand for "it can be non-trivially observed that in some works, X is Y". Next, even when looking into those works where the trope appears, it may reflect anything from fictional "fact" (X is indeed Y) to tendency, possibly subverted in a twist (X tends to be Y, but ''whoops'') to belief (X is held to be Y, at least by one character). Sometimes we make an example of a work because it ''[[AvertedTrope averts]]'' X is Y, and that's still not paradoxical. In the RealLife sections of tropes, or in the case of a trope that deals with the real world (as the last pair in the list above does) contradictions could suggest a possible paradox, but most such can be explained by (acceptable) vagueness and bias.
* DontShootTheMessage is a good example of a Logic Bomb that surfaces often in everyday life, at least where people's political or religious convictions are concerned. Let's say, to give a couple of examples, you're discussing Roman Catholic priests who molest children (and [[FriendToPsychos the faithful Catholics who attempt to cover up these incidents]], of course), or maybe the [[BourgeoisBohemian Bourgeois Bohemians]] who are such an image problem for modern American liberalism. Such people are said to be bad because [[{{Hypocrite}} they do not live up to the ideals they preach]] - but they are more often condemned by their ideological opponents than by decent people on their side who you'd think would urge the hypocrites to StopBeingStereotypical. Ideological opponents, of course, criticize a given movement because they think it's inherently bad. But if you don't actually believe in an ideology you say you believe in, and that ideology is bad, then logically you must be good. But it's unethical to live a lie, so you must be bad - even if what you're lying about is something that's bad in the first place, in which case undermining it is good, and so on ''ad infinitum''. Humorously summed up by OscarWilde in ''TheImportanceOfBeingEarnest'', when he has the character of Cecily say: "I hope you have not been leading a double life, pretending to be wicked and being really good all the time. That would be hypocrisy."
* During the first Gulf War, an American Patriot missile battery in Dhahran, Saudi Arabia failed to intercept an incoming Scud missile, which struck a barracks and killed 28 soldiers. The cause was a software rounding error: the system kept time in tenths of a second, but 0.1 cannot be represented exactly in the 24-bit fixed-point format it used, so after about 100 hours of continuous operation the clock had drifted by roughly a third of a second, enough to put the radar's range gate off the target (sketched below).
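** A back-of-the-envelope sketch in C of where that drift comes from; the constants follow the commonly cited figures from the post-incident report, and the code itself is purely illustrative.
--> #include <stdio.h>
--> int main(void) {
-->     /* 0.1 has no exact binary representation; chopped to the precision of the
-->        system's 24-bit fixed-point register it loses about 0.000000095 s per tick. */
-->     double true_tenth   = 0.1;
-->     double stored_tenth = 838860.0 / 8388608.0;   /* the chopped value */
-->     double ticks        = 100.0 * 3600.0 * 10.0;  /* tenths of a second in 100 hours */
-->     printf("clock drift after 100 hours: %.2f seconds\n",
-->            ticks * (true_tenth - stored_tenth));  /* roughly 0.34 seconds */
-->     return 0;
--> }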
* The Ancient Greek philosopher Zeno of Elea proposed [[http://en.wikipedia.org/wiki/Zeno_paradox several famous logical paradoxes]] which completely baffled his contemporaries. They reference ideas about time and its relation to objects in motion that were beyond the Greek logicians of the day. Contemporary philosophers who tried to refute his conclusions found themselves hitting the brick wall of the most basic rule of Logic: unsound conclusions are reached by unsound methods. By the Logical method, it wasn't enough to simply state that his conclusions were incorrect; they needed to prove ''why'' they were incorrect, by analyzing his reasoning and locating the flaw in it. And yet, as near as any of his contemporaries could determine, there was ''no'' flaw in Zeno's reasoning. Common sense dictated that Zeno's conclusions were unsupportable, even ridiculous - everyone ''knew'' a man could outrun a turtle, for example - but the reasoning he used to come to those conclusions was sound... which, by the Logical method, meant that they had to be correct. Not even Aristotle could resolve the dichotomy; his solution was simply not to engage, out of fear of getting trapped in an infinite loop.
** On the other hand, there is a school of thought which holds that Zeno never meant for the paradoxes to be taken as serious philosophical questions, but rather to illustrate a flaw in the Logical method (which they quite handily did).
** On the other other hand, the solutions to many of these are now obvious to anyone who understands such modern concepts as infinite sums and limits. [[BoringButPractical Or the humble zero.]]
*** The problem is that infinite sums and limits do ''not'' resolve the clash between the Logical Method and RealLife (where we know Achilles will overtake the turtle); they only describe the problem very precisely. JorgeLuisBorges makes this point in his essay "The Perpetual Race of Achilles and the Tortoise" (a worked version of the sum follows the quote):
--> Suffice it to fix the velocity of Achilles at one meter per second in order to establish the time he'll need. 10 + 1 + 1/10 + 1/100 + 1/1000 + 1/10000 ... The limit of the sum of this infinite progression is twelve (more exactly, eleven and one fifth; more exactly, eleven and three twenty-fifths), which is never attained. That is to say, the hero's stretch may be infinite and he'll run it forever, but his course will give out before covering twelve meters, and his eternity will never see the end of twelve seconds. This methodical dissolution... is not really hostile to the problem: rather, it is to imagine it well.
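*** For the record, the series quoted above can be summed directly with the geometric-series formula; the partial sums (10, 11, 11.1, 11.11, ...) never reach the limit, which is Borges' point, but the limit itself is a perfectly finite number (Borges' own rounded figures come out slightly differently):
--> $10 + 1 + \tfrac{1}{10} + \tfrac{1}{100} + \cdots \;=\; \sum_{n=0}^{\infty} \frac{10}{10^{n}} \;=\; \frac{10}{1 - \frac{1}{10}} \;=\; \frac{100}{9} \;\approx\; 11.11$
--> At one meter per second that is about 11.11 seconds (and 11.11 meters); past that point, Achilles is ahead of the tortoise.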
* In the early days of chess-playing computers (and other programs built to evaluate strategic choices), the computer often crashed when there were multiple choices, none of which offered an advantage over another, so that it couldn't choose one (see the sketch below).
** Which itself is an adaptation of the classic paradox of a donkey being placed equidistant to two equally sized, equally nutritious bags of feed - unable to choose between the two, the dictates of pure logic would lead it to starve to death exactly where it was.
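** A minimal sketch in C of the fix modern programs take for granted; the scoring numbers are invented. A rule of "pick the move that beats all the others" has no answer when several moves score the same, but breaking ties by simply keeping the first candidate sidesteps the donkey's dilemma entirely.
--> #include <stdio.h>
--> /* Return the index of a best-scoring move; ties keep the earliest candidate. */
--> static int pick_move(const int scores[], int n) {
-->     int best = 0;
-->     for (int i = 1; i < n; i++) {
-->         if (scores[i] > scores[best]) {  /* strictly better only */
-->             best = i;
-->         }
-->     }
-->     return best;
--> }
--> int main(void) {
-->     int scores[] = { 10, 10, 10 };  /* three equally good moves */
-->     printf("chosen move: %d\n", pick_move(scores, 3));  /* prints 0, no deadlock */
-->     return 0;
--> }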
[[/folder]]

----