"I lost last time. This means that this time I am totally going to win."Believing that dice/coins have memory, or that independent events will occur in "streaks". If a coin has just landed on heads four times in a row, surely it's much more likely to get tails this time, to even things out... or alternatively, heads is on a roll and will appear next time, too. See also Random Number God and Artistic Licence Statistics. In fact, if you toss a previously untested coin and (say) heads come up, there's a larger chance to get heads on a second roll, because the coin might be biased, although not very much larger, unless the coin is so warped that the imperfection is clearly visible. Psychologically, this fallacy tends to come from the fact that the odds to replicate a pattern do go up cumulatively. The probability of rolling 20 on a d20 twice is 1/400, the same as any expected sequence of two numbers. The probability of rolling the first is 1/20, and the probability of rolling the second is also 1/20. The fallacy occurs when someone assumes that once they've rolled two 20s in a row, it's less likely than usual (< 1/20) that they'll get another 20. In reality, once they've rolled two 20s in a row, it's just as likely as ever (1/20) that they'll roll a 20 again. This also, most notably, works the other way around - if they've lost many bets in a row, they aren't any more likely to win the next bet. Psychologically, what you're doing is inventing patterns that fit with the events you observe despite not really being there at all, combined with a big scoop of entitlement. A similar misinterpretation is that if an event has a probability of 1-in-n, then you are guaranteed a success if you make n attempts. As an exaggerated example, the probability of a "heads" on an unbiased coin is 1/2, therefore, flipping a coin twice is guaranteed to get at least one "heads." This is not true. 
Another factor is that many people confuse "a number of independent events" (where any of several permutations will do) with "a series of independent events" (where only one permutation will do). If we flip a coin twice, we have a 50% chance of getting one heads and one tails in some order (heads-tails and tails-heads; the other two possibilities are heads-heads and tails-tails). But if we specify that we want the series "heads-tails", the probability of that particular series is only 25%; the outcome tails-heads no longer fits the criteria. (Of course, any particular series has the same chance of coming up: heads-tails is exactly as likely as tails-heads, heads-heads, or tails-tails, namely 25%.)

Also, stuff really does even out over time, just not in the way some people might think. Say you have flipped a coin five times and gotten 4 heads and 1 tails: heads has come up 80% of the time. Now suppose the next ten flips go the "normal" (most common) way, 5 heads and 5 tails, for a running total of 9 heads and 6 tails. Heads is now down to 60%, so the proportion moved toward 50%, but the counts didn't exactly "even out": heads is still three flips ahead. To put it another way: flip a coin 10 times, and the chance that heads came up 4 or more times is 82.81%; flip it 1000 times, and the chance that heads came up 400 or more times is 99.99999999%. But even if heads came up fewer than 400 times, the next flip is still 50/50.

This is also why playing a large number of low-stakes games in casinos increases the house's chances of making money: the house advantage only decides who wins a small percentage of the time, but that advantage "evens out" over the long haul. Unless you're a good card counter, taking advantage of free stuff, or just there to enjoy playing, you're more likely to come out ahead with a small number of high-stakes bets.

Note that the Gambler's Fallacy applies only to systems that both have no memory and are explicitly known to be fair.
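The binomial percentages quoted above can be verified exactly in a few lines (a sketch; `p_at_least` is just an illustrative helper):

```python
# Exact binomial check of the percentages quoted above.
from math import comb

def p_at_least(n, k):
    """P(at least k heads in n fair coin flips)."""
    return sum(comb(n, i) for i in range(k, n + 1)) / 2**n

print(round(p_at_least(10, 4) * 100, 2))   # 82.81
print(p_at_least(1000, 400))               # 0.9999999999...
# ...and either way, the NEXT flip is still 50/50.
```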
Drawing cards without replacement (read: the deck now has "memory") does alter the probabilities of the next cards drawn, and if you do not explicitly know that the process being tested is fair, you can use things like n heads in a row to draw conclusions about bias in the system (see the Non-examples and Theater sections below). Compare Sunk Cost Fallacy.
— Engie-tan, Nerf NOW!!
- George Carlin, in "Stuff You Don't Want to Hear":
"And Chuck, 9 out of 10 people who have this operation die on the table. But don't you worry 'bout it, 'cause we've already done 9 people this year, and they're all dead. So you're in the clear, you know what I mean?"
- Noted (and exploited) in other people by Guild of Gamblers member Emmanuelle-Marie Lapoignard les Deux-Epées in Discworld fic The Black Sheep. She deplores it in others and uses other, subtler, methods of subverting this trope in the Sto Kerrig casino, where lax management has gifted her an inadvertently biased roulette wheel. She's a professional gambler, after all. And a good one who tends to come out on top, one way or the other.
- As a teacher at the Assassins' Guild School, Emmanuelle was once tasked with covering a sick colleague's Maths class. Aware that the Assassins are a morally minded Guild who look after the moral welfare of their pupils and thus cannot be seen condoning gambling, she chose to teach Probability Theory through the medium of packs of cards, the racing form in the back pages of the Ankh-Morpork Times, and via a roulette wheel borrowed from the Gamblers' Guild. An educative time was had by all and much probability theory was imparted, including the Gambler's Fallacy.
- The Guild of Gamblers on the Discworld operates on the rule Never Give A Sucker An Even Break. The Guild exists largely to regulate how far its members can mark cards, what sort of spring-loaded devices they can keep up their sleeves to insert a fifth Ace, and how deeply billiard balls may be shaved.
- The narrator in Edgar Allan Poe's "The Mystery of Marie Rogêt" mistakenly believes this fallacy to be correct, though he notes how hard it is to convince anyone of it:
Nothing, for example, is more difficult than to convince the merely general reader that the fact of sixes having been thrown twice in succession by a player at dice, is sufficient cause for betting the largest odds that sixes will not be thrown in the third attempt. A suggestion to this effect is usually rejected by the intellect at once. It does not appear that the two throws which have been completed, and which lie now absolutely in the Past, can have influence upon the throw which exists only in the Future. The chance for throwing sixes seems to be precisely as it was at any ordinary time—that is to say, subject only to the influence of the various other throws which may be made by the dice. And this is a reflection which appears so exceedingly obvious that attempts to controvert it are received more frequently with a derisive smile than with anything like respectful attention.
- In Only Fools and Horses, after beating Boycie at poker, Del Boy offers him double or nothing on the spin of a coin. Boycie's response is "I've beaten you on a spin twice, Del. By the law of averages, you've got to win this time." (The coin isn't fair, as it happens, but Boycie doesn't know that.)
- In the Friends episode in which the gang goes to Las Vegas, Phoebe notices an old woman who always seems to win a jackpot at a slot machine immediately after Phoebe has left the machine in question. Ross explains that this is a "lurker": someone who waits for a slots player to have a losing streak and leave the machine, then swoops in to "cash in" on their jackpot. Of course, slot machines are entirely randomized.
- On My Name Is Earl, Earl mentions a Noodle Incident wherein he lost a series of Rock-Paper-Scissors games to a monkey. The monkey threw "Rock" several times, and just when Earl decided to throw "Paper," the monkey threw "Scissors."
- In a Peanuts comic, Lucy uses this fallacy to convince Charlie Brown to try to kick the football again.
- Many Warhammer gamers or tabletop roleplayers will tell you that this is absolutely true. Others will perform astounding feats of Mathhammer in mid-game and tell you exactly how many units of each side should die in an assault, what the expected variance is, and whether or not the assault makes sense in terms of points of enemy units destroyed versus your own losses.
- A Pokémon has a 1 in 4096 (before Gen VI, 1 in 8192) chance of appearing as shiny. Given any number of encounters, it's still very possible to never see a shiny, as the shiny odds per random encounter never increase or decrease barring very specific exploits the game barely informs you of.
- Discussed at length in Rosencrantz and Guildenstern Are Dead, in which Rosencrantz flips a coin 85 times in a row and gets heads every time. Guildenstern suggests that it shouldn't be surprising since each coin has an equal chance of coming up heads or tails. Neither Rosencrantz nor Guildenstern is satisfied with this explanation. (Technically, Guildenstern is right in that given a fair coin, a series of 85 heads is exactly as probable as any other single series of that length [namely, 1 in 2^85]; however, if a coin should actually land the same 85 times, it's a good reason to believe that such a coin [or flip] is NOT fair.)
- Ultra Fast Pony has a doubly-fallacious example. First, Twilight is treating a not-remotely-random system (namely, hiding from a dangerous killer) as if it were random—and then, within that system, Twilight says a plan that's failed once is therefore due to succeed soon.
Twilight: Hmm. I've got it. We'll run away and hide! Because it didn't work this time, so according to the laws of heads and tails, it must work the next time!
Fluttershy: I don't think the laws of probability work like that.
- Darths & Droids. The Munchkin gamer Pete, facing a situation where a dice roll of 1 would be disastrous, encourages Annie to use one of his dice: "I've pre-rolled the ones out of it." The Rant explains that, beforehand, Pete had carefully prepared a number of 20-sided dice that had rolled two 1s in a row, and placed them in a special, roll-proof container. Since the chances of rolling three 1s in a row is only 1 in 8000, surely rolling another 1 from these pre-rolled dice is almost impossible, right? A bit later in the comic, one of those pre-rolled dice actually does come up as 1. Pete's reaction?
Pete: Awesome! That die will be even luckier next time!
- The Simpsons:
- In "The Mansion Family", Mr. Burns needs to go to the hospital for a while and thinks he should get Homer to watch over his mansion while he's gone. When Mr. Smithers points out that this is a bad idea because Homer screwed up everything else Mr. Burns has ever asked him, Burns responds by saying that since Homer failed so many times, he's due for a good performance.
- In "Homie the Clown", Krusty loses a lot of money betting against the Harlem Globetrotters, because he figured that "the Generals were due".
- In "Margical History Tour", the retelling of Henry VIII's life shows Henry (Homer) meeting Anne Boleyn (Lindsay Naegle), who touts her track record of bearing sons. So, they marry, but she produces a daughter, And Henry has her beheaded for it.
- The Martingale system works if three conditions are met: the player must have access to infinite reserves of capital, the player must be willing to endure a losing streak of any length, and the house must tolerate a table bet of infinite size. So it doesn't work. Casinos can hire mathematicians too; they'd never allow any system which can beat them, but what they will do is design it to look beatable. Now, losing ten times in a row is pretty rare right? Well, it's a 1 in 1024 chance, which means it's likely to happen before you win 1024 times, and almost certain to happen before you win much more than that. And that's assuming 50/50 odds. Basically, it's like the house is buying under-priced lottery tickets from you and you're hoping they don't win. Also, since you're playing so many games to do this, the "evening out" effect of the house advantage makes your chances worse than some other strategies (more information on that near the end of the trope description).
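For the curious, here is a rough Python sketch of a Martingale session with made-up parameters (a 1023-unit bankroll covers at most ten straight losses on a 1-unit base bet):

```python
# Martingale on a fair 50/50 bet with a finite bankroll: double the stake
# after every loss, reset to the base bet after every win. Parameters are
# made up for illustration.
import random

def martingale_session(bankroll=1023, base_bet=1, rounds=200, seed=0):
    rng = random.Random(seed)
    stake = base_bet
    for _ in range(rounds):
        if stake > bankroll:       # can't cover the doubled bet: busted
            break
        if rng.random() < 0.5:     # win: pocket the stake, reset
            bankroll += stake
            stake = base_bet
        else:                      # loss: double up
            bankroll -= stake
            stake *= 2
    return bankroll

# Averaged over many sessions, the fair game gives back exactly the starting
# bankroll: lots of small wins, occasionally one ruinous streak.
outcomes = [martingale_session(seed=s) for s in range(1000)]
print(sum(outcomes) / len(outcomes))
```

Most individual sessions end slightly ahead, which is exactly why the system looks beatable; the rare busted sessions eat all of those small profits.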
- The most famous example of this fallacy is the posting of roulette history in casinos. It's designed to trick people into falling right to this thought. For example, people might see the last few hits were red and so they bet on black. But in fact, there's still the same chance of landing on either color (18/38, assuming a double-zero layout).
- This assumes that the wheel is fair. In reality, some gamblers were able to win big money by observing the bias of individual wheels (nowadays, analyzing the results with a computer, too), and betting accordingly.
- Others will see the last few numbers being red as a trend that red is "hot" and will continue to bet on red until it cools down again. This extends to numbers and groups of numbers, which some claim are "sleeping" and "waking up."
- On August 18, 1913, one of the roulette wheels at Monte Carlo Casino landed black 26 times in a row. Gamblers lost millions of francs betting on red, because they believed that red was due to come up soon. Because of this incident, the Gambler's Fallacy is alternately known as the Monte Carlo Fallacy.
- Gamblers apply this logic to slot machines, as pointed out in the book Smart Slot Strategies. The author notes how gamblers will ignore the fact that a machine is controlled by a random number generator and will assume a machine is "hot" or "cold" based on how it's performing. Many players even believe they can tell how a machine is programmed by playing it around 20 times, despite the random number generator having millions of possible outcomes. That said, individual machines really can be set with different payout tables; a sample of 20 plays is just far too small to tell which.
- There is a gambler's saying: If a coin is flipped 10 times in a row and comes up heads each time, the layman will assume tails is "due" and bet on tails. The mathematician will assume each flip is an independent event and not bet either side. The gambler will assume that something weird is going on and will not bet either side with the person flipping the coin—but will bet a third party that the next flip will come up heads.
- During each of the World Wars, a variant of this appeared as the Shell Hole Fallacy: when enemy artillery was randomly blazing away at a field, some troops (and some commanders!) believed that jumping into the crater a bursting shell had just made would increase their chances of survival, since the odds of any one spot being hit twice during a given shelling were relatively low. While the artillery gunners weren't entirely unbiased (they were trying to distribute their shots across the field in hopes of maximizing the number of targets hit), they really didn't have much control over the randomizing factors, so the actual odds that a given shell would strike where one had already struck were pretty much the same as the odds of that spot being struck the first time. In any event, jumping into those craters did not perceptibly improve the soldiers' survival rate.
- This thread on Operation Sports, regarding an unusual coin toss streak on Madden NFL. Oddly enough, it was later proven that Madden coin tosses were indeed deterministic.
- Job seekers invoke this when sending out résumés or applications: they may believe that if they apply to enough employers and job postings, at least one will have to hire them sooner or later. Hiring, though, depends on many factors, such as the overall economy, the job outlook in an industry or occupation, the number of competing applicants, one's credentials, and so on.
- The same applies to creative professionals: the writer who constantly submits to publishing houses, the musician who sends demo recordings to many record labels, and so on. They may believe that out of so many submissions, sooner or later at least one recipient will have to accept. Again, being accepted, signed, or published often depends on many factors, such as the state of the market, the quality of the submission, and so on.
Looks like this fallacy but is not:
- When the events are not independent: If you draw 10 red cards from a shuffled deck without replacing them, then the next card really is more likely to be black than red, because, of the 42 cards remaining, 26 are black but only 16 are red. This sort of real-world situation is, in fact, hypothesized to be how humans developed the intuitions that lead to this fallacy in the first place.
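The deck's "memory" can be sketched in a couple of lines (the helper name is ours):

```python
# Without replacement, the deck remembers: each card drawn shifts the odds.
from fractions import Fraction

def p_next_black(reds_drawn, blacks_drawn=0):
    """Chance the next card is black, given cards already removed from a
    standard 52-card deck (26 red, 26 black)."""
    return Fraction(26 - blacks_drawn, 52 - reds_drawn - blacks_drawn)

print(p_next_black(0))     # fresh deck: 1/2
print(p_next_black(10))    # ten reds gone: 13/21, about 61.9%
```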
- If it has not been established that the trials are fair, then a significant deviation from the expected results could count as evidence that they are biased somehow. If a die rolls a 6 at least 10 times in a row, simple statistics say that the die is extremely likely to be weighted, which means that, your adversary trying to manipulate you notwithstanding, you'd better bet on another 6.
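As a toy illustration of how lopsided that evidence is, here is a Bayesian update sketch with completely made-up numbers for the prior and for how loaded the die is:

```python
# How strong is ten sixes in a row as evidence of a loaded die?
# Assumed (invented) prior: 1 in 1000 dice are loaded to roll a six 90%
# of the time. Both numbers are purely for illustration.
prior_loaded = 1 / 1000
p_data_loaded = 0.9 ** 10        # ten sixes from the loaded die
p_data_fair = (1 / 6) ** 10      # ten sixes from a fair die

posterior_loaded = (p_data_loaded * prior_loaded) / (
    p_data_loaded * prior_loaded + p_data_fair * (1 - prior_loaded)
)
print(posterior_loaded)          # well above 0.999: bet on another six
```

Even with a 1-in-1000 prior, ten sixes makes "loaded" overwhelmingly more plausible than "fair", because (1/6)^10 is so tiny.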
- Recognizing Regression to the Mean. In brief, in a series of independent random events, the most likely outcome going forward is "average". So after a good roll of the dice, expect your next roll to be average. After a bad roll, expect your next roll to be average. After an average roll, expect your next roll to be average. Just don't assume it's guaranteed to be average.
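A quick simulation of the dice version (illustrative only):

```python
# After a 6, the next roll of a fair die still averages 3.5 -- it regresses
# to the mean rather than compensating downward.
import random

rng = random.Random(1)
pairs = [(rng.randint(1, 6), rng.randint(1, 6)) for _ in range(200000)]
after_six = [b for a, b in pairs if a == 6]
print(sum(after_six) / len(after_six))   # close to 3.5, not suspiciously low
```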
- Recognizing a Poisson process at work. A simple question: if 21 just came up on a roulette wheel, which spin is most likely to produce the next 21? Most will naively conclude it's most likely in about thirty-eight spins, or that all spins have the same odds of producing the next 21. In fact, the correct answer is the very next spin. After a 21 comes up, the odds of a 21 on the very next spin are one in thirty-eight. But for the very next 21 to occur two spins later, a 21 must first fail to occur on the first spin. In other words, the odds of spin N turning up the very next 21 are (1/38)*(37/38)^(N-1), where N is the number of spins after that first 21. Consequently, this sort of process produces a geometric distribution of results (the discrete analog of the exponential distribution). This applies in any case where the process is truly random, from the flips of a coin to a gamer's dice, and it accounts for the counter-intuitive "clumpiness" of real randomness. The next spin is the mode of the distribution (the single outcome with the highest probability), while 38 is the expected value (the long-run average, which minimizes the expected squared error of your guess). If you want to guess exactly when a 21 will come up, pick the next spin; if you want to be as close as possible, on average, to when the next 21 comes up, guess 38. Also, since the process is memoryless, you can start counting whenever you want: whether or not you just saw a 21, the next spin is always the single most likely one to yield a 21, and the expected number of spins until the next 21 is always 38, regardless of history.
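The distribution above can be sketched in a few lines (illustrative only):

```python
# Distribution of "spins until the next 21" on a double-zero wheel:
# geometric with p = 1/38. The single most likely value is the very next
# spin, yet the long-run average wait is 38 spins.
p = 1 / 38

def pmf(n):
    """P(the next 21 arrives exactly on spin n)."""
    return p * (1 - p) ** (n - 1)

probs = [pmf(n) for n in range(1, 6)]
print(probs)                      # strictly decreasing: spin 1 is the mode
mean = sum(n * pmf(n) for n in range(1, 5000))
print(round(mean, 2))             # 38.0
```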
- A related point probability theory demonstrates is that expected winnings and the odds of coming out ahead are not the same thing. A gambler can play a game whose expected value per play is favorable, yet whose probability of leaving him at least breaking even shrinks toward zero the more times the game is played, because the expectation is propped up by increasingly rare, increasingly enormous wins.
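One concrete way this happens, sketched with made-up numbers: a bet that multiplies your bankroll by 1.5 on a win and 0.6 on a loss, at even odds, has a favorable expected value per round, yet almost every long sequence of such bets loses money:

```python
# Each bet multiplies the bankroll by 1.5 (win) or 0.6 (loss), 50/50.
# Expected multiplier per round: (1.5 + 0.6) / 2 = 1.05 -- a "good" bet.
# Typical multiplier per round: sqrt(1.5 * 0.6) ~= 0.95 -- a slow wipeout.
import random

def final_bankroll(rounds, seed):
    rng = random.Random(seed)
    bankroll = 1.0
    for _ in range(rounds):
        bankroll *= 1.5 if rng.random() < 0.5 else 0.6
    return bankroll

results = [final_bankroll(500, s) for s in range(2000)]
ahead = sum(r >= 1.0 for r in results) / len(results)
print(ahead)        # only a small fraction of players end up ahead
print(1.05 ** 500)  # yet the *expected* bankroll is astronomical (~4e10)
```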
- Generally speaking, many video games have a mechanic that tweaks the RNG, so that long strings of excessively bad (or good) luck are less likely than they would be in a memoryless system. This is partly because players expect this, and partly because such strings are really friggin annoying to deal with (at least, in games that aren't centered around being Crazy-Prepared to deal with them).
- In Normal or lower difficulties of XCOM: Enemy Unknown, a shot fired by a soldier after another soldier misses a 50%+ shot is more likely to hit... because the game cheats in your favor, secretly adding an extra 10% chance to the next shot, and keeps doing so until a shot hits. It likewise reduces alien shot accuracy by 10% after an alien shot hits, until one misses. The game stops cheating on Classic or Impossible difficulty, throwing off veteran Normal players.
- City of Heroes utilises a system called the Streak Breaker. This mechanic in the attack calculations sometimes forces attacks that would normally be misses to instead hit, based on the current number of misses in a row versus your chance to hit. In short, it breaks streaks of misses.
- This also shows up among players of World of Warcraft, in particular with the rare dragon whelp pets that drop out in the world: many players assumed that the more whelps you kill, the higher your chance of finally getting a drop, while in reality each kill's chance was independent of every past or future kill... but because people refuse to accept that improbable means neither impossible nor certain (or simply because the huge variation in time required was annoying), the World of Warcraft developers eventually modified this to conform to players' expectations: your chance of a drop now gradually increases the more you kill. This was also implemented because of "kill this mob and loot this item off them" quests whose drop rate was not 100%; some had unusually low drop rates, and you could spend an hour or more trying to finish one. With the change, the chance rises with each empty "loot" and resets to the default rate with every success.
- Kingdom of Loathing's "adventure queue" remembers the last 5 combat and non-combat encounters you've had in each area, and if the normal selection method picks one of them, it'll reject it 75% of the time and pick a new one.
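The streak-breaker idea running through these examples can be sketched as a small "pity" RNG (hypothetical class and numbers, not any particular game's actual implementation):

```python
# A "pity" RNG in the spirit of the mechanics above: each consecutive miss
# adds a flat bonus to the next roll's chance to hit; the bonus resets on
# a hit. Class name and +10%-per-miss figure are invented for illustration.
import random

class PityRNG:
    def __init__(self, bonus_per_miss=0.10, seed=None):
        self.rng = random.Random(seed)
        self.bonus_per_miss = bonus_per_miss
        self.bonus = 0.0

    def roll(self, base_chance):
        hit = self.rng.random() < min(1.0, base_chance + self.bonus)
        self.bonus = 0.0 if hit else self.bonus + self.bonus_per_miss
        return hit

# With a 50% base chance and +10% per miss, a miss streak can never exceed
# five: by the sixth attempt the chance has been pushed to 100%.
pity = PityRNG(seed=0)
rolls = [pity.roll(0.5) for _ in range(10000)]
```

The design point is the same one the examples make: long miss streaks become impossible rather than merely unlikely, which is what players intuitively (and wrongly) expect of a memoryless RNG.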