
"I lost last time. This means that this time I am totally going to win."
Believing that dice or coins have memory, or that independent events will occur in "streaks". If a coin has just landed on heads four times in a row, surely it's much more likely to come up tails this time, to even things out... or alternatively, heads is on a roll and will appear next time, too. See also Random Number God and Artistic Licence Statistics. In fact, if you toss a previously untested coin and (say) heads comes up, there's a slightly larger chance of getting heads on the second toss, because the coin might be biased; not much larger, though, unless the coin is so warped that the imperfection is clearly visible.
Psychologically, this fallacy tends to come from the fact that the odds of a specific pattern compound as it gets longer. The odds of rolling 20 on a d20 twice in a row are 1/400, the same as any other particular sequence of two numbers: the odds of rolling the first are 1/20, and the odds of rolling the second are also 1/20. The fallacy occurs when someone assumes that once they've rolled two 20s in a row, it's less likely than usual (less than 1/20) that they'll get another 20. In reality, once they've rolled two 20s in a row, it's just as likely as ever (1/20) that they'll roll a 20 again. This also, most notably, works the other way around: if they've lost many bets in a row, they aren't any more likely to win the next bet. Psychologically, what you're doing is inventing patterns that fit the events you observe despite their not really being there at all, combined with a big scoop of entitlement.
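The distinction between "odds of the whole sequence, judged in advance" and "odds of the next roll, given what's already happened" can be checked in a couple of lines of Python (a minimal sketch using exact fractions):

```python
from fractions import Fraction

p20 = Fraction(1, 20)    # chance of a 20 on one roll of a fair d20

# Judged before any dice are thrown, two 20s in a row is 1/400:
print(p20 * p20)         # 1/400

# But once two 20s have already happened, the next roll is unaffected;
# the conditional probability of a third 20 is still exactly 1/20:
print(p20)               # 1/20
```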
A similar misinterpretation is that if an event has odds of 1 in n, then you are guaranteed a success if you make n attempts. As an exaggerated example: the probability of "heads" on an unbiased coin is 1/2, therefore flipping a coin twice should guarantee at least one "heads". This is not true; the chance is only 3/4.
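A quick sketch of the actual arithmetic: n independent attempts at a 1-in-n event give a success probability of 1 - (1 - 1/n)^n, which approaches 1 - 1/e (about 63%) for large n, never 100%:

```python
def at_least_one_success(n: int) -> float:
    """Chance of at least one success in n independent tries at odds 1/n."""
    return 1 - (1 - 1 / n) ** n

print(at_least_one_success(2))      # 0.75 -- two coin flips, not "guaranteed"
print(at_least_one_success(1000))   # ~0.632, approaching 1 - 1/e
```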
Another factor is that many people confuse "a number of independent events" (where any of a number of permutations will do) with "a series of independent events" (where only one permutation will do). If we flip a coin twice, we have a 50% chance of getting heads and tails in some order (heads-tails and tails-heads; the other two possibilities are heads-heads and tails-tails). But if we specify that we want the series to be "heads-tails", the probability that that particular series will come up is only 25%; the outcome tails-heads no longer fits the criteria. (Of course, any series has the same chance of coming up. You have as much chance of flipping heads-tails as you do tails-heads, heads-heads, or tails-tails; namely, 25%.)
Also, stuff really does even out over time, just not in the way some people might think. Say you have flipped a coin five times and gotten 4 heads and 1 tails: heads has come up 80% of the time. Now suppose the next ten flips go the "normal" (most common) way, 5 heads and 5 tails, bringing the totals to 9 heads and 6 tails. You're now at only 60% heads, so while the percentage shrank, the three-flip surplus of heads never actually "evened out."
To explain the above another way: flip a coin 10 times, and the chance that heads came up 4 times or more is 82.81%. Flip it 1000 times, and the chance heads came up 400 times or more is 99.99999999%. But even if it was less than 400, the next flip is still 50/50. This is also why playing a large number of low-stakes games in casinos increases the chances of the house making money: the house edge only decides who wins a small percentage of the time, but that advantage "evens out" in the house's favour over the long haul. Unless you're a good card counter, taking advantage of free stuff, or just enjoy playing, you're more likely to be successful with a small number of high-stakes events.
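The 82.81% figure is an exact binomial tail, easy to reproduce with the standard library (a minimal sketch):

```python
from math import comb

def p_at_least(k: int, n: int) -> float:
    """Probability of at least k heads in n flips of a fair coin."""
    return sum(comb(n, i) for i in range(k, n + 1)) / 2 ** n

print(round(p_at_least(4, 10) * 100, 2))   # 82.81
print(p_at_least(400, 1000) > 0.9999999)   # True: the *ratio* evens out...
# ...but the next flip is still 50/50 no matter what came before.
```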
Note that the Gambler's Fallacy applies only to systems that both have no memory and are explicitly known to be fair. Drawing cards without replacement (read: the deck now has "memory") does alter the probabilities of the next cards drawn, and if you do not explicitly know that the event being tested is fair, you can use things like n heads in a row to draw conclusions about bias in the system (see the Theatre section and "Looks like this fallacy but is not" below).
Examples:
Comedy
 George Carlin, in "Stuff You Don't Want to Hear":
"And Chuck, 9 out of 10 people who have this operation die on the table. But don't you worry 'bout it, 'cause we've already done 9 people this year, and they're all dead. So you're in the clear, you know what I mean?"
Fan Fiction
 Noted (and exploited) in other people by Guild of Gamblers member Emmanuelle-Marie Lapoignard les Deux-Epées in the Discworld fic The Black Sheep. She deplores it in others and uses other, subtler methods of subverting this trope in the Sto Kerrig casino, where lax management has gifted her an inadvertently biased roulette wheel. She's a professional gambler, after all. And a good one who tends to come out on top, one way or the other.
Literature
 The Guild of Gamblers on the Discworld operates on the rule Never Give A Sucker An Even Break. The Guild exists largely to regulate how far its members can mark cards, what sort of spring-loaded devices they can keep up their sleeves to insert a fifth Ace, and how deeply billiard balls may be shaved.
Live-Action TV
 In Only Fools and Horses, after beating Boycie at poker, Del Boy offers him double or nothing on the spin of a coin. Boycie's response is "I've beaten you on a spin twice, Del. By the law of averages, you've got to win this time." (The coin isn't fair, as it happens, but Boycie doesn't know that.)
 When we finally meet Nico's father in season two of The Wire, he's sitting in a bar betting on the horses according to his "system", and is currently losing. But he boasts that he's still up $7,000... if you aggregate the last 25 years he's been playing.
Tabletop Games
 Many Warhammer gamers or tabletop role-players will tell you that this is absolutely true. Others will perform astounding feats of Mathhammer mid-game and tell you exactly how many units on each side should die in an assault, what the expected variance is, and whether or not the assault makes sense in terms of points of enemy units destroyed versus your own losses.
Video Games
 City of Heroes utilises a system called the Streak Breaker. This mechanic in the attack calculations sometimes forces attacks that would normally be misses to instead hit, based on the current number of misses in a row versus your chance to hit. In short, it breaks streaks of misses.
 This one also shows up among players of World of Warcraft, in particular with the rare dragon whelp pets that drop out in the world. Many players assume that the more whelps you kill, the higher your chance of finally getting a drop grows, when in reality each kill's chance is independent of every past and future kill.
 Because people refuse to accept that improbable does not mean impossible or certain (or simply because the large variation in time required was annoying), World of Warcraft developers actually modified this detail to conform to players' expectations. Your chances for a drop do gradually increase the more you kill.
 This was also implemented because of some "kill this mob and loot this item off them" quests where the drop rate was not 100%. Some of these quests had unusually low drop rates, and you could spend an hour or more trying to finish one. With this change, the chance rises with each kill that fails to drop the item, then resets to the default rate with every success.
 In fact, many video games play around with the RNG like this, both because players expect it, and because it's really, really annoying to be on the receiving end of a string of bad luck, unless it's the kind of game where being CrazyPrepared for bad luck is expected of the player.
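A "pity timer" of the sort described above might look something like this sketch (the base rate and step size are invented for illustration; the games' real numbers and formulas are not public):

```python
import random

BASE_RATE = 0.01   # hypothetical base drop chance (1%)
STEP = 0.01        # hypothetical bonus added after each failed kill

def kills_until_drop(rng: random.Random) -> int:
    """Simulate kills under a pity timer; return how many were needed."""
    chance = BASE_RATE
    kills = 0
    while True:
        kills += 1
        if rng.random() < chance:
            return kills       # got the drop; the rate resets next time
        chance += STEP         # no drop; the next kill is a bit more likely

rng = random.Random(0)
print(kills_until_drop(rng))   # with these numbers, never more than 100 kills
```

With the rate climbing by one percentage point per failure, the chance reaches 100% by the hundredth kill, which is exactly the bounded wait players expect and true randomness doesn't provide.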
 A Pokémon has a 1 in 8192 chance of appearing as shiny. Back then, people thought this meant 1 Pokémon out of every 8192 encountered would be shiny, when in fact every encounter is an independent 1-in-8192 chance. It's possible to go your entire career and never see a shiny.
Theatre
 Discussed at length in Rosencrantz & Guildenstern Are Dead, in which Rosencrantz flips a coin 85 times in a row and gets heads every time. Guildenstern suggests that it shouldn't be surprising since each coin has an equal chance of coming up heads or tails. Rosencrantz is not satisfied with this explanation, and neither is Guildenstern. (Technically, Guildenstern is right in that given a fair coin, a series of 85 heads is exactly as probable as any other single series of that length [namely, 1 in 2^85]; however, if a coin should actually land the same 85 times, it's a good reason to believe that such a coin [or flip] is NOT fair.)
Web Animation
 Ultra Fast Pony has a doubly fallacious example. First, Twilight is treating a not-remotely-random system (namely, hiding from a dangerous killer) as if it were random, and then, within that system, Twilight says a plan that's failed once is therefore due to succeed soon.
Western Animation
 In one episode of The Simpsons, billionaire Mr. Burns needs to go to the hospital for a while and thinks he should get Homer to watch over his mansion while he's gone. When Mr. Smithers points out that this is a bad idea because Homer has screwed up everything else Mr. Burns has ever asked him to do, Burns responds that since Homer has failed so many times, he's due for a good performance.
 In another, Krusty loses a lot of money betting against the Harlem Globetrotters, because he figured that "the Generals were due".
Real Life
 The Martingale system works if three conditions are met: the player must have access to infinite reserves of capital, the player must be willing to endure a losing streak of any length, and the house must tolerate a table bet of infinite size. Since none of those conditions ever holds, it doesn't work. Casinos can hire mathematicians too; they'd never allow any system that could beat them, but what they will do is design the games to look beatable.
 Now, losing ten times in a row is pretty rare, right? Well, it's a 1 in 1024 chance, which means it's likely to happen at least once before you win 1024 times, and almost certain to happen before you win much more than that. And that's assuming 50/50 odds. Basically, it's like the house is buying underpriced lottery tickets from you and you're hoping they don't win. Also, since you're playing so many games to do this, the "evening out" effect of the house advantage makes your chances worse than some other strategies (more information on that near the end of the trope description).
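A simulation makes the failure mode concrete. Below is a sketch of Martingale on an American-roulette even-money bet; the bankroll, base bet, and table limit are invented for illustration. The occasional long losing streak wipes out the steady trickle of small wins:

```python
import random

def martingale_session(rng, bankroll=1000, base_bet=1,
                       table_max=500, p_win=18 / 38, spins=10_000):
    """One Martingale session on even-money roulette bets (illustrative
    parameters). Returns the final bankroll."""
    bet = base_bet
    for _ in range(spins):
        if bet > bankroll or bet > table_max:
            break              # can't place the doubled bet: the system dies
        if rng.random() < p_win:
            bankroll += bet
            bet = base_bet     # win: pocket the base stake, start over
        else:
            bankroll -= bet
            bet *= 2           # loss: double up to chase the money
    return bankroll

rng = random.Random(42)
ended_down = sum(martingale_session(rng) < 1000 for _ in range(200))
print(f"{ended_down} of 200 sessions ended down")  # typically well over half
```

Each completed doubling cycle nets a single base bet, but one streak of nine losses costs 511 units and hits the table limit, so most sessions end behind despite "winning" almost every cycle.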
 The most famous example of this fallacy is the posting of roulette history in casinos, which is designed to lure people into exactly this line of thinking. For example, people might see that the last few spins came up red and so bet on black. But in fact, there's still the same chance of landing on either color (18/38, assuming a double-zero layout).
 This assumes that the wheel is fair. In reality, some gamblers were able to win big money by observing the bias of individual wheels (nowadays, analyzing the results with a computer, too), and betting accordingly.
 Others will see the last few numbers being red as a trend that red is "hot" and will continue to bet on red until it cools down again. This extends to numbers and groups of numbers, which some claim are "sleeping" and "waking up."
 On August 18, 1913, one of the roulette wheels at the Monte Carlo Casino landed on black 26 times in a row. Gamblers lost millions of francs betting against black, because they believed that red was due to come up soon. Because of this incident, the Gambler's Fallacy is also known as the Monte Carlo Fallacy.
 Similar to the roulette example above, gamblers apply this logic to slot machines, which is pointed out in the book Smart Slot Strategies. The author points out how gamblers will ignore the fact that a machine is controlled by a random number generator and will assume a machine is "hot" or "cold" based on how it's performing. It's further explained that many players believe they can tell how a machine is programmed by playing the machine around 20 times, despite the random number generator having millions of possibilities. A sample size of 20 plays is far too small to determine any trend in a population that size.
 With that said, individual machines really can be set with different payout tables. Still, 20 plays is too small for trends.
 There is a gambler's saying: If a coin is flipped 10 times in a row and comes up heads each time, the layman will assume tails is "due" and bet on tails. The mathematician will assume each flip is an independent event and not bet either side. The gambler will assume that something weird is going on and will not bet either side with the person flipping the coin—but will bet a third party that the next flip will come up heads.
Looks like this fallacy but is not:
 When the events are not independent: If you draw 10 red cards from a shuffled deck without replacing them, then the next card really is more likely to be black than red, because of the 42 cards remaining, 26 are black but only 16 are red. This sort of real-world situation is, in fact, hypothesized to be how humans developed the intuitions that lead to this fallacy in the first place.
 It's also how counting cards in Blackjack works, but seriously, between the six-to-eight-deck shoes, constant reshuffling, Rain Man-like mental math, and pit bosses kicking out players for doing it? Not worth the hassle. Stick to mastering the basic game principles and playing a perfect game; it's far easier.
 Casinos have no problem with perfect play. It still awards the house a small advantage, and off that tiny statistical edge casino owners build financial empires. Some casinos give out cards with "perfect play" (called Basic Strategy) on them. The gamblers often then lose even more because they buy into this trope and then don't believe the casino would tell you how to play Blackjack correctly.
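The card-drawing arithmetic above is easy to verify exactly (a minimal sketch with exact fractions):

```python
from fractions import Fraction

black_left, red_left = 26, 26 - 10   # ten red cards drawn, none replaced
remaining = black_left + red_left    # 42 cards still in the deck

print(Fraction(black_left, remaining))  # 13/21 chance the next card is black
print(Fraction(red_left, remaining))    # 8/21 chance it is red
```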
 If it has not been established that the trials are fair, then a significant deviation from the expected results could count as evidence that they are biased somehow. If a die rolls a 6 at least 10 times in a row, simple statistics say that the die is extremely likely to be weighted, which means that, your adversary trying to manipulate you notwithstanding, you'd better bet on another 6.
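To put rough numbers on "extremely likely to be weighted", compare how well each hypothesis explains ten 6s in a row; the 50% bias for the loaded die is an invented example figure:

```python
p_fair = (1 / 6) ** 10     # ten 6s from a fair die: about 1.65e-8
p_loaded = (1 / 2) ** 10   # ten 6s from a die weighted to roll 6 half the time

# The loaded-die hypothesis explains the observation about 59,000 times
# better, so even a small prior suspicion of tampering becomes near-certainty.
print(round(p_loaded / p_fair))   # 59049, i.e. 3**10
```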
 Recognizing Regression to the Mean. In brief, in a series of random independent events, the most likely outcome is "average." So after a good roll of the dice, for example, expect your next to be average. And after a bad roll, expect your next roll to be average. And after an average roll, expect your next roll to be average. Just don't assume it will be average.
 Recognizing a Poisson Process at work. A simple question: if 21 just came up on a roulette wheel, which spin is most likely to produce the very next 21? Most will naively conclude it is most likely in about thirty-eight spins, or that all spins have the same odds of producing the next 21. In fact, the correct answer is the very next spin. After a 21 comes up, the odds of a 21 on the very next spin are one in thirty-eight. However, for the very next 21 to occur two spins later, a 21 must first fail to occur on the first spin. In other words, the odds of a given spin turning up the very next 21 are equal to (1/38)*(37/38)^(N-1), where N is the number of spins after that first 21. Consequently, this sort of process produces an exponential (geometric) distribution of results. This applies in any such case where the process is truly random, from the flips of a coin to a gamer's dice, and it accounts for the counterintuitive "clumpiness" of real randomness.
 To clarify, the very next spin is the mode of the distribution (the single outcome with the highest probability), while 38 is the expected value (the long-run average wait). If you want to guess exactly which spin will produce the next 21, pick the very next one; but if you want to get as close as possible to when the next 21 arrives, guess 38. Also, since the process is memoryless, you can start counting whenever you want: whether or not you just saw a 21, the very next spin is always the most likely one to yield a 21, and the expected number of spins until the next 21 is always 38, regardless of history.
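The mode-versus-mean distinction is easy to check numerically (a sketch for the double-zero wheel):

```python
P_HIT = 1 / 38   # chance any given spin lands on 21 on a double-zero wheel

def next_21_on_spin(n: int) -> float:
    """Probability the *next* 21 arrives exactly n spins from now."""
    return P_HIT * (1 - P_HIT) ** (n - 1)

probs = [next_21_on_spin(n) for n in range(1, 2001)]
print(probs[0] == max(probs))                  # True: spin 1 is the mode
mean = sum(n * next_21_on_spin(n) for n in range(1, 20_000))
print(round(mean))                             # 38: the expected value
```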

