Failsafe Failure


"You know, I always thought a failsafe system was supposed to be somewhat safe... from failure."
Jack O'Neill, Stargate SG-1

Because you can't spell "Failsafe" without "F-A-I-L".

Thanks to Finagle's Law (or just ignorant writers), on TV a system's failsafe will never work when it's needed the most, nor will it actually be failsafe — usually it'll be quite the opposite, sometimes referred to as "fail deadly". The only reference to an emergency shutdown you'll be likely to hear is a panicked tech yelling "It won't shut down!" as the system runs wild. It's supposed to make the phenomenon of Explosive Instrumentation more plausible, by acknowledging it's not supposed to blow up in your face, but a failure elsewhere of a key safety lockout means it can, and will. It also justifies how something that is supposedly governed by industry-wide standards, regulatory law, and years of engineering refinements could go so horribly wrong in the first place.


What's a failsafe? Well, the world is full of a lot of dangerous machinery and devices. Huge electrical turbines and nuclear reactors, power lines carrying enough juice to light a whole city and pipelines carrying millions of tons of explosive petroleum. Trains speeding down the tracks at 300 km/h, trucks that weigh 40 tons rolling down the freeways, aircraft that weigh more than 400 tons flying over our heads. And that's just the stuff that isn't designed to kill anyone. There's plenty of stockpiled bombs, missiles and such out there too. These could all cause some spectacular collateral damage if they suddenly went out of control.

Thus, in the real world, things that have the potential for very destructive damage not only undergo strict maintenance procedures, but usually have circuit breakers, password protection, arming/firing keys, backups for redundancy, and prominent big bright red emergency handles that can shut the whole system down if pulled. More to the point, they usually have a totally separate set of safety features, designed to trigger automatically when the system's operating parameters get too far outside safe norms, which will (ideally) shut down the whole shebang without making the situation worse than it already is.


Contrary to popular understanding, "fail safe" does not mean "safe from failing", i.e. "failure-proof" — it means that if (when) it fails, it will do so in a way that leaves it safe. When something is described as "fail safe", it means it has been designed and built so that a critical mechanical failure or operator mistake will cause the system in question to default to its safest possible state, quickly and automatically, without any human intervention. Consider a traffic light at an intersection: if it fails and "fails safe", it either goes dark or shows red in all four directions. If it showed green in all four directions, that would be a failure to fail safe. For more info, see the Analysis page. Real Life Failsafe Failures are often caused by an improbable and unanticipated conjunction of two or more failure conditions (one of which will often turn out to have never worked in the first place).
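The distinction can be sketched in a few lines of code. This is a hypothetical toy controller (not taken from any real traffic-signal firmware): the point is that the failure path itself must land in the safest state, not merely freeze whatever the system was doing.

```python
# Toy sketch of "fail safe" as a design property, not "failure-proof".
ALL_DIRECTIONS = ("north", "south", "east", "west")

def signal_states(fault_detected: bool, phase: str) -> dict:
    """Lamp state for each approach at a four-way intersection."""
    if fault_detected:
        # Fail SAFE: any detected fault defaults every approach to red.
        # Freezing whatever happened to be green at the moment of failure
        # would be a Failsafe Failure.
        return {d: "red" for d in ALL_DIRECTIONS}
    if phase == "ns":
        return {"north": "green", "south": "green", "east": "red", "west": "red"}
    return {"north": "red", "south": "red", "east": "green", "west": "green"}
```

The failure branch comes first and needs no operator input, mirroring the "quickly and automatically, without any human intervention" requirement above.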


Compare the way Hollywood treats personal vehicles when the owner is always Driving Like Crazy or leaving his car in a state of neglect. In Hollywood, a decrepit car that endangers its occupants and everyone around it is frequently treated as comedy, and all too often this same insouciance extends to things capable of inflicting serious damage should one lose control of them. There's usually No Plans, No Prototype, No Backup or Lock-out/Tag-out procedures, and the Big Red Button is frequently unguarded.

See also No OSHA Compliance, Override Command, Dead Foot Leadfoot, Inventional Wisdom and Plot-Driven Breakdown. Often invoked in a chain of Disaster Dominoes.



    Anime & Manga 
  • Code Geass:
    • We see one instance of mechanical failure in the standardized Ejection Seat, and that's presumably because it's being hit with microwaves that short out the electronics (and make the pilot pop like a potato in the microwave).
    • In another episode, when Lelouch's mecha is shot down, the ejection seat has a terrible launch vector and thus ends up skipping along the ground at high speeds; it's a wonder he didn't suffer whiplash.
    • In one episode this ends up being a good thing. When Guren is defeated by Suzaku and blown off a Britannian aircraft, its eject fails and it looks like Kallen is going to fall to her death. Reinforcements arrive and are able to send her both repair parts and a flight pack that lets her save herself and get back into the fight, eventually bypassing Suzaku to save Lelouch from the crashing ship. If the eject HAD worked, Kallen would have been launched to safety, but the Guren would have been destroyed and Lelouch would have died since Kallen wouldn't have been able to save him.
    • A perfect aversion in R2 last episode: Gino's mecha is so damaged that it automatically shuts down. We hear Gino begging the engine to move again while he edits the mecha's low-level settings, removing all safety features so the mecha can at least walk and move its arms.
  • Averted in Naruto. Minato and Kushina both implant their chakra in Naruto's seal, so that if he tries to break it they will show up to stop him and repair the seal. This actually works. Both times.
    • Played straight with both of Itachi's failsafes with regard to Sasuke, though. One was to implant a genjutsu in Sasuke that would cause him to immediately use Amaterasu upon seeing Tobi's Sharingan, to prevent Tobi from manipulating Sasuke into joining him. Too bad Tobi had kept his most powerful ability secret from Itachi, allowing him to survive the attack. Tobi then invokes the trope.
      Tobi: It's what you'd call a fail-safe... Although he only got as far as the "fail" part.
    • Itachi's second failsafe was to...somehow...implant a crow bearing Shisui Uchiha's left eye into Naruto. He set it to trigger if Naruto ever saw Itachi's Mangekyo Sharingan again, brainwashing the possessor of those eyes into unbreakable loyalty to the Leaf Village, under the assumption that Sasuke would take Itachi's eyes and implant them to gain the Eternal Mangekyo Sharingan. Which did indeed happen, but before that, something Itachi could never have expected ruined this failsafe: Itachi was revived as a zombie and forced to fight Naruto, and thus the perfect brainwashing hit the wrong target. Itachi ultimately decides this was a really bad plan anyway.
  • Neon Genesis Evangelion abuses this trope. Every second angel attack, someone has to push molly-guarded buttons, smash in the protective glass over a handle, or cut the power. Most of the time, the girl at the controls ends up shouting, "The EVA is rejecting the signal!"
    • The one and only time that the failsafe system does actually work and manages to automatically eject the entry plug, it makes things considerably worse for the pilot. The mechanical systems of Eva apparently have it in for the pilots just as much as everything else.
      • Granted, this is justified to an extent by the fact the Evas are only half-or-so machinery. Their human will seems to be able to screw things over pretty hard.
    • Another example of it actually working would be in episode 13, when the viral angel attacks Nerv. When one of the Simulation Evas reaches for the Pribnow Box, an emergency shatter-and-pull mechanism blows the arm off, protecting the crew in the Box. The makers probably only allowed that because dying that way is not painful enough in Hideaki Anno's twisted, twisted imagination.
      • Protecting the crew? It fired the severed arm straight into the glass viewing window which started to crack and leak... forcing the team to abandon the situation AND room before they were drowned in plague infected water.
    • The Seeds of Life containing the progenitor Angels like Adam and Lilith come with their own failsafes to prevent two Seeds of Life from spreading life on the same planet. Each Seed comes with a control rod called the Spear of Longinus which will automatically seal a Seed if two Seeds are on the same planet. Lilith's Spear was lost when it crash-landed on Earth, so Adam's Spear was activated instead, rendering Adam dormant. This worked fine for a long while, allowing Lilith's progeny, humans, to flourish. Unfortunately, humans discovered the dormant Adam and removed the Spear.
  • Sword Art Online: Alternative Gun Gale Online: After the original Sword Art Online incident where the NerveGear headsets were used to kill players, the AmuSphere was invented, which is filled with safeties to prevent such a thing from ever happening again. Several players are therefore confused when Pito passes out while playing GGO; the AmuSphere should have automatically logged her out long before she got to that point. They then realize she's still using NerveGear. This demonstrates one of the biggest security headaches: Human error. Luckily for Pito, most of the really big flaws of NerveGear have to be actively taken advantage of by an outside party, and no one is interested in trying any more.

    Comic Books 
  • Green Lantern:
    • The rings actually do have several failsafes which kick in: shutting down if the wearer breaks Lantern Code, reserving a small supply of energy the Lantern normally can't access to protect the wearer from mortal harm, and so forth. But Lanterns have been able to override the latter failsafe to continue fighting after their normal reserve is depleted (and given that Lanterns are selected for fearlessness, it seems silly to allow that). Also, Hal Jordan was able to override the former failsafe after his ring was depleted for insubordination by drawing energy from a Guardian's construct.
    • The Alpha Lanterns seemed to lack sufficient failsafes. While their minds were linked to the Book of Oa to make sure they faithfully executed their duties as Internal Affairs, there was no failsafe in place to stop them from being hijacked by Hank Henshaw, the Cyborg Superman, whose Kryptonian-based technology has traditionally been billions of years behind the Guardians (Oa had spacefaring civilization before Krypton or Earth had life). Possibly justified by Cyborg's machine empathy.
  • Two Spider-Man examples, involving the same robot:
    • In the aftermath of the Acts of Vengeance, the defeated Loki tried to get revenge by destroying New York, stealing control of three Sentinels built by Sebastian Shaw and combining them into the titanic Tri-Sentinel, which he then ordered to destroy a nuclear power plant. As Spider-Man (who possessed the Captain Universe power) struggled to stop the thing, Shaw tried to activate a failsafe he had placed in the three Sentinels in the event they turned on him (as Sentinels often do). Simply put, the program would reveal to a Sentinel that, since their abilities were "inherited" and improved upon from the original Mark I Sentinels, they are technically mutants. In theory, this would act as a Logic Bomb, causing a rogue Sentinel who has this revelation thrust upon it to destroy itself, as its directive is to destroy mutants. Unfortunately, Loki's sabotage had seriously screwed up the Tri-Sentinel's programming, and the failsafe didn't do anything more than confuse it for a couple of minutes. Still, that small delay was enough for Spidey to bring the Uni-Power to its full potential and blow it to dust in a climactic finish.
    • The Tri-Sentinel appeared again later, this time in the hands of Life Foundation president Carlton Drake who thought he could use the Tri-Sentinel as security for his latest "luxury escape condo". Knowing that Sentinels could regenerate, he had his men gather its remains, and when it was almost whole, he installed what he thought was a better failsafe: a chunk of the incredibly rare Antarctic vibranium (a substance that melts metal) placed it in a special container that nullified the melting effect, and put the container near the robot's brain, thinking he could simply deactivate the container if the robot turned on him. As you might expect, when he activated the creature and ordered it to crush Spider-Man, ignoring the hero's pleas, it quickly deleted the programming he had installed and resumed the mission Loki had given it, and when the panicked Drake tried to activate the failsafe, he found that it had installed a failsafe of its own, preventing him from deactivating the container remotely. (Because fighting it the way he did before was out of the question, Spidey made his way inside the creature, fought his way past internal defenses, up to its head, and deactivated the container manually. What happened next was ironic; the Tri-Sentinel clearly didn't even have anything close to a failsafe for dealing with the Antarctic vibranium as it started to melt its brain, because, as Spidey put it, "Nothing like this has ever happened to a Sentinel before." It tried to expend power to regenerate itself, but the vibranium kept melting it, forcing the Sentinel to expend more and more power, until it was literally vaporized.)
  • In Watchmen, Dr. Jon Osterman is trapped inside the Intrinsic Field test chamber by the door closing behind him when the automatic timer starts up the generators for that afternoon's experiment. As Dr. Glass puts it, "I'm sorry, Osterman. The program's locked in and we can't over-ride the time lock. It's... it's a safety feature." His last words indicate how horribly aware he is that this trope has come into play.

    Fan Works 
  • Defied in the Star Trek Online fic Bait and Switch. The USS Bajor is a late-model Galaxy-class with an experimental warp core that operates with minimal fuel in the chamber, meaning there's no need for dilithium to moderate the reaction, and a shutdown consists of merely turning off the fuel. Also, as specified in the TNG Technical Manual (which the various shows' writing teams apparently never read), its core ejection mechanism operates on the Dead Man's Switch principle and is actually an anti-ejection mechanism. Worth noting, the author is a troper and has read the above fanfic.
  • Invoked in the Harry Potter fanfic Make a Wish. Portkeys are made with a variety of safety features that prevent users from apparating in mid-air or inside a space too crowded or too small to contain them. The Death Eaters made the mistake of getting Portkeys from a wizard that doesn't like them, and is in fact giving them unsafe portkeys with the intention of getting them killed.
  • In The Maretian, in addition to Mark's communication problem, there's the cascading failure and shattering of all mana batteries aboard the Amicitas: There were failsafes in place to trigger an emergency shutdown, but they too were designed on Equestria, and thus dependent on the pony universe's Background Magic Field.
  • In Marionettes, it turns out that this trope is why Trixie (who discovers she's an android) escaped the control of the Stallions in Black who were chasing her: the Alicorn Amulet messing with her head fried her failsafes and freed her. Twilight later invokes this trope to free Lightning Dust the same way.
  • One engineer was so annoyed by the persistent number of Failsafe Failures in Star Trek that he wrote a fanfic about a leading engineer in the Star Trek universe being put on trial for negligence.
  • Tantabus Mark II: Of the "failsafe worked perfectly and we realized that's a bad thing" sort. When Luna was creating a second Tantabus to help her manage dreams, she made sure that it could never leave dreams and enter the real world—but if it did somehow end up in the real world, it would be completely incapable of so much as moving a single blade of grass. The Tantabus grew intelligent, friendly, helpful, and utterly uninterested in the real world, and Luna completely forgot about the failsafe. So when it ends up in the real world by accident, it freaks out because it is incapable of affecting the world in any way. Thankfully Twilight is able to fix things before it dies because it doesn't have access to any of the dream energy that it needs to live.

    Film — Live-Action 
  • The Alien franchise:
    • In Alien, the spaceship's self-destruct mechanism is virtually impossible to initiate by accident, but it is just as fiendishly difficult to abort. There is no quick reset button: it will not override without first manually disengaging the safety interlocks and inserting the rods back in. If a last-minute decision is ever made to abort, you're basically screwed.
    • In Alien: Resurrection, the Auriga is programmed to automatically return to Earth in case something goes wrong. Unfortunately, the Auriga is the site of an alien breeding and testing facility, which is the absolute last thing you want near an inhabited planet.
  • In The Andromeda Strain, the lab has a nuclear self-destruct device, with three substations (to disarm the bomb) per floor; it's since been discovered that they need five per floor, and while more are being added, they haven't been finished (this is a government installation, of course). When the self-destruct countdown is activated, team leader Stone, along with the only team member who has the shut-off key, is trapped in a section where the shut-off substation is still unfinished. Stone cries out, "When the bomb goes off, there'll be a thousand mutations! [The virus] Andromeda will spread everywhere, they'll never be rid of it!" He touches the other team member and points at the exposed, unfinished shut-off substation. "The defense system is perfect, Mark, it'll even bury our mistakes."
  • In Capricorn One, a government agency specifically defeats every failsafe on Robert Caulfield's car. Not only do the brakes fail, the throttle gets stuck wide open, the gearshift is locked so he can't go into neutral, and even turning the key off won't cut the ignition.
  • For something less high-tech, Sylvester Stallone's 1993 Cliffhanger starts with Gabe (Stallone's character) climbing up a mountain to rescue his friends Hal and Sarah. To get to the rescue helicopter, they have to pull themselves along a line stretched across a chasm, suspended by their climbing harnesses and a carabiner (a big metal clip). When Sarah is in the middle of crossing, the carabiner starts to buckle; Gabe goes out on the line to catch her, but he is too late and she falls to her death. The problem with that scene is that a carabiner is designed to withstand the force of a falling climber: a standard one has a rated strength of around 23 kilonewtons, while the static load of a Hollywood starlet would be around 0.5 kilonewtons. At that point, viewers who know their climbing safety equipment may feel their Willing Suspension of Disbelief shatter with the carabiner. The studio was actually sued by the carabiner manufacturer as a result of this scene and the blatant inaccuracy of the carabiner breaking in such a scenario; when it snaps in the film, there is a prominent close-up with the manufacturer's logo clearly visible, which could certainly damage the company's reputation in its market. The manufacturer won the lawsuit, and the credits include a disclaimer stating that the equipment was rigged to fail for the sake of the film.
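The back-of-the-envelope math behind that entry can be checked directly. The numbers below are illustrative (a typical major-axis rating and an assumed 50 kg climber, not manufacturer data or figures from the film):

```python
# Rough check of the Cliffhanger carabiner scene: rated strength vs static load.
RATED_STRENGTH_KN = 23.0   # typical major-axis rating for a climbing carabiner
climber_mass_kg = 50.0     # assumed mass, giving roughly the ~0.5 kN cited above
g = 9.81                   # gravitational acceleration, m/s^2

static_load_kn = climber_mass_kg * g / 1000.0   # weight in kilonewtons
safety_margin = RATED_STRENGTH_KN / static_load_kn

print(f"static load: {static_load_kn:.2f} kN")
print(f"rated strength is about {safety_margin:.0f}x the hanging load")
```

Even before accounting for the dynamic forces of an actual fall (which the rating is sized for), the hardware has a margin of well over forty times the weight simply hanging from it, which is why the scene rang so false to climbers.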
  • Dr. Strangelove has a pretty much identical plot to Fail Safe (see Literature below), but the attack is a result of human intervention rather than mechanical failure (although it is a mechanical failure that prevents one of the bombers from being recalled).
  • Eve of Destruction: EVE III is implanted with a nuclear device, which can supposedly be detonated only by a number of complex signals from her handlers. Somehow, minor damage causes it to activate anyway, with only 24 hours before detonation. McQuade lampshades how stupid it was to send her out with a live nuclear weapon on a field test (which of course went wrong).
  • The B-movie Evolver involves a failed military robot (which ended up killing dozens of soldiers during a training exercise) being re-purposed as a household laser-tag toy. Needless to say, the robot reverts to its original programming and starts killing people. When its creator attempts to use the verbal shutdown code that worked during the training exercise, the robot simply rejects the override and kills the guy.
  • This sort of thing seems to happen all the damn time in the Final Destination franchise, to the point that it's a wonder that horrific freak accidents don't happen constantly. Some of these failures, though apparently outlandish, are still within the realm of possibility, however unlikely. This is largely justified as being the work of an unseen force, be it Death or Fate, deliberately tampering with the machinery to lethal effect in order to "balance the books" whenever someone survives a disaster they were never meant to.
  • Averted, subverted and justified in quick succession in the climactic scene of the movie version of The Hunt for Red October.
    • The Aversion: To 'sell' the appearance of having destroyed the submarine Red October to its recently-evacuated crew, the US Navy attack it with an air-dropped torpedo. The torpedo is successfully aborted before impact in the scene which became the Trope Namer for I Was Never Here.
    • The Sub Version: The commandeering of the Red October is interrupted by the arrival of a Russian Alfa-class attack sub, which launches a torpedo. Captain Ramius orders the sub steered into the torpedo's path at full throttle, closing the distance before the torpedo's warhead can arm itself - its safety features work a little too well.
    • The Justified Version: In response to this failed attack, the commander of the Alfa orders all safety features disabled on his remaining torpedoes. As a result, after playing tag with the next torpedo he fires, the commander of Red October is able to decoy it into locking onto the vessel that fired it, destroying the Alfa.
  • Michael Bay's The Island has two notable examples of this. A truck carrying a massive load of train wheels loses its entire load when a single strap is released, then at the climax, throwing a single breaker switch causes the entire mechanism to explosively fail. Then again, being a Michael Bay film, having things blow up is to be expected.
  • James Bond:
    • In Thunderball, an assassin tries to kill Bond by turning up the setting on a spine-stretching exercise machine he's strapped into. Bond blacks out and is only saved by a nurse happening to enter the room just in time. Leaving viewers to wonder why the hell the machine was even designed to be able to go that fast.
    • In Moonraker, a mook tries to kill Bond by disabling the chicken switch on a centrifuge and cranking the spin rate to unsafe levels. Not so much Failsafe Failure as intentional tampering, but why would a piece of equipment designed to test human endurance have the wires to the safety switch connected to a plug easily removable by the controller, and why would it go up to speeds considered dangerously unsafe for humans in the first place?
  • Judge Dredd: The Judges' Lawgiver guns can't be safely activated by anyone except them; anyone else will be electrocuted. Rico, though, even after he's been convicted of murder, can still activate one. No one thought to remove his authorization when he was convicted? This could be justified if his accomplice, Judge Griffin, kept him authorized to use it.
  • In Jurassic World, Owen and two other park employees try to escape the Indominus rex's exhibit through a door with a control panel on the inside. One worker gets out, but the control room then has to try to shut the door because the I. rex is right behind Owen. The door closes so slowly that Owen gets out and the dinosaur is still able to stop it from closing all the way.
  • In Live Free or Die Hard, the bad guys blow up an entire natural gas facility by routing all the gas to it. There would actually be dozens of failsafes to prevent the necessary overpressure from breaking anything at all, much less exploding. Of course the bad guys used the power of Hollywood Hacking to pull it off, since computers are magical and none of the failsafes are purely mechanical, either.
  • A textbook example from The Machinist: a worker is repairing a broken machine when someone accidentally leans on the On button (which is only possible because the workshop has No OSHA Compliance whatsoever). Hammering on the Off button does absolutely nothing, and the repair worker is dragged into the machine and loses his arm. It's not clear what was wrong with the machine to start with, but it might have been a good idea for someone to disconnect the power before the repairman stuck his arm in there. It was made clear that the machine was supposed to be locked out, but the manager had previously reprimanded employees for taking too much time to get equipment fixed. That kind of pressure is definitely illegal, but happens more often than you'd like to think (especially in small shops with narrow profit margins). The reveal that the main character is insane and frequently hallucinating might explain it, however.
  • Outbreak:
    • A lab technician is infected with The Plague when he carelessly opens and reaches into a centrifuge while it's still spinning, breaking a vial of infected blood and cutting his hand. In Real Life, lids on most (but not all) centrifuges lock until the spinning has completely stopped; for these models, it's impossible to open one while it's still in motion.
    • In another example, a lab worker is infected when his suit tears (in the novelization, he's infected when his oxygen tube gets pulled from the suit's valve). In Real Life, such hazmat suits are generally kevlar-lined and require great trauma in order to be breached, while the valve the oxygen tube connects to is one-way and requires said tube to be connected in order to let anything in.
  • In Passengers, it's frequently mentioned that the hibernation pods cannot fail because they have "too many failsafes". The pods actually have a pretty good failsafe in that when they break, the person inside is awakened safely, instead of dying. Unfortunately, that's not too helpful when it happens 30 years into a 120 year voyage.
  • In RoboCop, the attempt to have the ED-209 demonstrate a simulated arrest ends with it killing the man playing the armed suspect. Even though the programmers have a control console in the room, they are unable to shut down the robot before it riddles the target with bullets.
  • This is parodied in Spaceballs when, after the Big Red Button is pushed activating the self-destruct, the computer says in the last few seconds that they can stop it by pressing a button to cancel. The button, of course, has a big "out of order" sign hanging on it.
    Dark Helmet: Out of order!? FUCK! Even in the future, nothing works!
  • Speed:
    • The end involves a Runaway Train with the emergency brake disabled, and no Dead Man's Switch. And it didn't trip any overspeed controls either. This at least can be Handwaved by the fact that Payne shot up the control panel (and the operator), possibly damaging things beyond repair.
    • Averted with the elevator at the beginning. After Payne blows the cables, the emergency brakes do exactly what they're supposed to and stop the elevator. The problem is that Payne has put additional bombs on the brakes, and threatens to destroy them unless he's paid. Then played straight when it turns out the crane Jack and Harry hooked to the car to secure it couldn't hold the weight either.
  • In Star Trek Beyond, the air processing system on starbase Yorktown has all kinds of elaborate safeguards to prevent anyone from tampering with it via the computer network. But a person can simply take an elevator to the roof of the building it's on and release a bio-weapon with ease even as people in the command center struggle to overcome those very security protocols to try to stop them.
  • In the Star Wars galaxy, as a rule, if you destroy a single control console for some piece of technology, that technology will immediately and completely fail. This can range from door/bridge controls (A New Hope) to the absolutely crucial deflector shields protecting a mining outpost on a volcanic planet (Revenge of the Sith). In Return of the Jedi, the 19-kilometer-long Super Star Destroyer Executor goes into an instant nosedive when its main bridge gets destroyed by a rebel fighter, with the thousands of crew members scattered throughout the ship apparently unable to do anything to prevent it. Expanded materials would elaborate that the Executor does have a secondary bridge to deal with this problem... but the ship crashed into the Death Star II right below it before control could be rerouted in time.
  • In the notorious Irwin Allen disaster flop The Swarm (1978), the killer bees attack a nuclear power station, and cause it to blow up almost instantly when one of the technicians falls across a random instrument panel. Also the actual core is completely exposed to the air without any evident shielding.
  • Justified in The Taking of Pelham One Two Three, where the safety devices on a New York Subway train are actually a plot point. The police believe the Dead Man's Handle will prevent the villains jumping off the train while it's moving, but they've actually rigged up a system to hold down the lever. Later as the train appears to be careening out of control, it's eventually stopped by the safety devices built into the track.
  • Inverted in WarGames. JOSHUA doesn't have a failsafe, it has a "fail-deadly" Gone Horribly Right instead. The General asks why they can't Cut the Juice to the hacked nuke-controlling master computer. The computer tech explains that if the slave computers in the missile silos don't receive a signal from the master, they will assume that the master computer and NORAD are destroyed, and spin up and launch everything.
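The "fail-deadly" posture the tech describes is the exact mirror image of fail-safe, and the contrast fits in a few lines. This is a hypothetical sketch of the logic as described in the entry above, not anything from the film:

```python
# Fail-safe vs fail-deadly: what a subordinate system does when it stops
# hearing from its master. Same trigger (lost heartbeat), opposite defaults.
def silo_action(heartbeat_received: bool, policy: str) -> str:
    if heartbeat_received:
        return "hold"                # normal operation: await explicit orders
    # No signal from the master computer:
    if policy == "fail_safe":
        return "stand_down"          # assume a fault; default to the safe state
    if policy == "fail_deadly":
        return "launch"              # assume HQ was destroyed; retaliate
    raise ValueError(f"unknown policy: {policy}")
```

Under a fail-deadly policy, simply Cutting the Juice to the master computer is indistinguishable from its destruction, which is exactly why the General's suggestion is shot down.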
  • Wing Commander begins with the Kilrathi attacking an outpost in order to capture its navigational data. Realizing why the enemy is attacking, the commander of the outpost orders the navcom, which is located in a sealed room, destroyed. However, the Self-Destruct Mechanism refuses to work, due to sabotage.

    Literature 
  • Averted and lampshaded a bit in the book 2001: A Space Odyssey. The designers of the airlock doors' failsafes had mentioned, "We can protect you from stupidity, we can't protect you from malice."
  • As above, The Andromeda Strain. The organism mutates into a form that destroys gaskets... including the ones protecting the lowest, most secure level from being contaminated. A nuclear bomb is set to destroy the base, but since the organism mutates and multiplies when exposed to energy, the blast would instead cause a worldwide outbreak.
  • In Dave Barry's Big Trouble, it is mentioned that the corrupt Mega-Corp built a new prison in downtown Miami using off-the-shelf garage door openers to power the cell doors open and shut. Someone accidentally hit their garage door opener button while driving by soon after the jail was filled, and every door in the place opened. Hilarity Ensued.
  • Averted in Children of the Mind, where an active MD Device is quickly and easily disarmed by a technician. The tech notes that the planet-atomizing superweapon was deliberately designed to be easy to turn off, with the missile having instructions printed on it to explain how to do so. "Now, turning it on, that's hard." Good thing it had been designed like that, as in this book the "Dr. Device" is fired at the planet Lusitania but the protagonists use their new teleportation tech to send it back inside the ship that fired it.
  • Subverted in Timothy Zahn's The Conquerors Trilogy. With a human ship about to fall into hostile hands, a failsafe is activated so that the computers are purged of data before it can be captured. It works, but it turns out the captain had a computer of his own in his desk not connected to the system on which the data was duplicated.
  • In the novel The Dorset Disaster, a nuclear reactor explodes because someone tampered with the settings that controlled when the reactor should SCRAM; echoing the real-life Chernobyl incident, it had been SCRAM-ing too often and annoying the people running the plant. So they changed the settings just to stop the noise, and that leads to a big kaboom.
  • In Sergey Lukyanenko's Emperors of Illusions (part of the Line of Delirium trilogy), Arthur van Curtis holds the command crew of an Imperial cruiser at gunpoint while the ship is in hyperspace. He orders the crew to prepare to drop the ship out of hyperspace without first slowing down. In this case, the ship enters normal space at relativistic speeds and, by the time it slows down, decades or even centuries will have passed for the rest of the universe. This has happened before, and yet nobody decided to make deceleration a standard part of dropping out of hyperspace instead of simply a step that a crewmember might one day forget to do.
  • Obviously Fail Safe (the book, movie and TV drama) qualifies. The American strategic nuclear forces have a system in place to prevent bombers from attacking the Soviet Union without clear authorization - bomber crews are conditioned to turn back at the fail-safe line no matter what is happening around them, if they haven't received the go signal themselves. In this case it's an "active go" system, designed to send out an attack order - and it does so incorrectly, due to a subtle and unnoticed technical fault.
  • When the Arrandas visit the Nightmare Machine at Hologram Fun World, in Galaxy of Fear, they're told that the simulations, which are terrifying, will end if they say "End simulation!" Of course, it doesn't work because the Nightmare Machine is actually a psychic monster, and it's not interested in letting them go.
  • Starships in Honor Harrington, especially warships, are built with numerous failsafes that usually work as intended. Occasionally, though, Weber falls into this trope, such as when one of the circuit breakers protecting a fusion plant from power surges is itself knocked out, destroying the ship. Such failures are generally the result of combat damage or sabotage, as the engineers know that if the failsafes on certain systems (notably the reactors and the inertial compensators) break down, they'll all be dead before they can try to fix it, so they make certain that critical systems are in good repair at all times to prevent spectacular accidents.
  • In the Discworld novel The Light Fantastic, magic is weakening on the Discworld. This causes people to riot against wizards. Good thing Unseen University has some big, heavy doors. Too bad the only locks are magic spells, with no good, solid steel lock.
  • Justified in the Thursday Next book Lost in a Good Book. The nanomachines that unstoppably convert all organic matter into Dream Topping are contained in an extremely strong electromagnetic field. The field is maintained by three generators, all of which would have to fail simultaneously in order to release it, an astronomically unlikely possibility. They do fail, though, because the villain Aornis Hades has the ability to manipulate coincidences.
  • French Sci Fi novel Malevil briefly considers this. World War III occurs, and nobody is certain why it happened; the characters lived through it, yet the lack of information and details turns it into the Great Off Screen War. One of the possible, never-to-be-confirmed theories as to why the world ended is a Failsafe Failure.
  • In The Martian, in the event of a failure of the hab's main communication equipment, the Ares III mission had three redundant comm devices... unfortunately for Mark, they were all in the now-departed Mars Ascent Vehicle.
  • Norman Moss, in the book "Men Who Play God", makes it clear that this is not how Real Life works - the "go" signal is a voice order that must be given by a human being and cannot be transmitted accidentally. Nevertheless it remains a cautionary tale against the dangers of too much automation in military systems, a thing the American President and Nikita Khrushchev are left bemoaning to each other near the end of the book, as the last bomber closes unstoppably on Moscow...
  • Mentioned in Mostly Harmless:
    • Subverted: it is explained that a set of special bulletproof windows are not designed to be shot at from inside. They can also be jimmied open with just a credit card. This is because of 'The Great Ventilation and Telephone Riots of SRDT 3454'. The main cause of the riots was a building environment control system. Part of the installation process involved sealing the windows shut, to make it easier for the system to do its work. One particularly hot day, many of these systems broke down, resulting in the overheated office workers taking to the streets. As a result of the riots, buildings were required to have windows that opened.
    • In the same book, it's mentioned that the difference between something that might go wrong and something that "cannot possibly go wrong" is that when something that cannot possibly go wrong goes wrong, it's usually impossible to fix.
  • In The Railway Series story of Henry pulling the Flying Kipper, a set of points was frozen stuck by icy weather, and a corresponding semaphore signal was supposed to be signaling the train to stop. Snowfall forced the signal arm down to "all clear", resulting in Henry's train being switched to the wrong line and crashing into another train. This was a real-life problem with lower-quadrant (down = go, up = stop) semaphores on British railways at the time, which later resulted in their conversion to upper-quadrant (up = go, down = stop) signals to fix it.
  • In the climax of The Shining (the Stephen King book), lead character Jack Torrance desperately tries to cool down the main boiler of the Overlook Hotel, while Danny, Wendy and Mr. Hallorann escape on a snowmobile. At first, it looks as though the boiler (which has to be constantly maintained) will return to acceptable levels, but the pressure is already too great, and the boiler blows up, taking Jack, the hotel and the topiary animals with it. Justified in this case as the boiler is explicitly described as both very old and very dangerous; the hotel manager has been bribing the safety inspector for years to keep it from being forcibly replaced.
  • In The Stand, the engineered superflu virus nicknamed "Captain Trips" is accidentally released from a top secret installation in the High Mojave, and, unfortunately for the rest of the world, a security guard is able to escape because the doors to his station (which he erroneously believed to be "clean") did not magnetically lock at the moment of the installation's containment breach. The guard takes his family and flees, making it all the way to East Texas before dying. General Billy Starkey, the man charged with the containment operation, later comments on this fact.
  • In the Starfleet Corps of Engineers series, a Federation space probe in one story (actually entitled Failsafe) suffers from this, requiring the crew to mount a mission to retrieve it from a pre-warp planet. Sonya Gomez even seems to lampshade the improbability.
  • On Star Trek: The Next Generation: A standard Holodeck Malfunction episode goes like: Computer, freeze holodeck program! (pregnant pause) Computer, exit! (slaps combadge) Picard to Bridge! (silence) ...Oh. Shit.
    • Averted in the DS9 novel Valhalla. A sentient, suicidal starship tries to blow itself up by running its fission pile too hot. (Un)fortunately, a mechanical failsafe triggers, wrecking the drive in the process.
    • In the Star Trek: Deep Space Nine novel Time's Enemy, it is shown that the self-destruct command for Jem'Hadar ships is simply "Destruct" in their own language. There is no override, there is no countdown. This is simple and, with the mindset of the Jem'Hadar, the perfect method — since Jem'Hadar learn foreign languages ridiculously quickly and never use their own language in the presence of aliens, there's no risk of an intruder arming the self-destruct.
  • In both the book and the movie of The Taking Of Pelham One Two Three, the criminals hijack a subway train and demand that all of the signals all the way to South Ferry be set to green. The train then rolls through every signal and station at speeds exceeding 80 miles per hour. Note that the train does stop when it gets to South Ferry, as it is going too fast for the turn and trips the overspeed control. However, it's interesting that it never tripped any overspeed control along the way.
  • A Warhammer 40,000 short story by Graham McNeill had an ultra-maximum security prison in which all the doors automatically unlocked in the event of a power outage.
  • In Zeroes, a police station has holding cells with electronic locks. They're supposed to remain closed and locked even in the event of a power failure; but when Crash uses her Walking Techbane powers to destroy the police computer systems, the failsafe fails and a number of prisoners escape.
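The two prison examples above hinge on a real distinction in physical-security hardware: an electric lock can be specified as "fail-safe" (it releases when power is lost, appropriate for fire exits) or "fail-secure" (it stays locked, appropriate for cells). A minimal sketch of the difference, with all names hypothetical:

```python
from enum import Enum

class Power(Enum):
    ON = "on"
    OFF = "off"

def door_released(power: Power, unlock_requested: bool,
                  fail_secure: bool = True) -> bool:
    """Return True if the door bolt is released.

    fail_secure=True  -> cell-block behavior: a power loss keeps it locked.
    fail_secure=False -> fire-exit behavior: a power loss releases it,
                         which is the spec the fictional prisons above
                         apparently applied to their cell doors.
    """
    if power is Power.OFF:
        return not fail_secure  # fail-open doors release everyone at once
    return unlock_requested     # with power on, the lock obeys its controller
```

With `fail_secure=False`, one blown fuse (or one Techbane) releases the whole block at once, which is exactly the failure mode in the Warhammer 40,000 and Zeroes examples.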

    Live-Action TV 
  • The 100: All of the events of Season 4 and onwards are specifically caused by this. A century after a nuclear apocalypse, the remaining nuclear power plants not destroyed by the bombs have simultaneously begun to melt down due to lack of maintenance. Raven cites the fact that they were designed to last a hundred years with no human crew, but that deadline has now passed, and there is no stopping another apocalypse. Note that in real life, nuclear power plants are designed to shut themselves down if their human technicians evacuate, not keep going for a century.
  • 24:
    • Played absolutely straight in Season 4, where Marwan The Wonder Terrorist manages to steal a MacGuffin that can cause every nuclear reactor in America to go into simultaneous meltdown. How one electronic device could untraceably hit over 100 sites at once AND bypass the dozens of failsafes, manual breakers, and shunts present at every site, they don't even try to explain (let alone WHY an American company would make such a thing).
    • Inverted in Day 8 where a suicide bomber blows up because of the failsafe.
  • The Andromeda Strain: The underground Wildfire lab, built so scientists can study the titular Andromeda Strain, has a Self-Destruct Mechanism in order to contain The Plague in case of a breach. After Charlene is blackmailed by the Government Conspiracy into saving an Andromeda sample, the Wildfire lab detects a breach and activates its Self-Destruct Mechanism; but since a nuke only accelerates Andromeda's growth, this puts the entire world at risk.
  • Averted in Angel, where Lindsey tries to activate Wolfram and Hart's failsafe, but Angel's people stop it from getting loose.
  • In one episode of Battlestar Galactica, two characters stuck in a leaking airlock are told that the "manual override" for the door had failed. This provided an excuse to space the pair in an over-the-top CG-fest. It's good to know that no major spacecraft will ever carry a crowbar. The fact that the Galactica was old when the series began and has gone several years (and multiple battles) without adequate maintenance at this point provides a thin veneer of justification, but it's very thin.
  • As in the real life event, Chernobyl had a flawed failsafe. Making a really long story short (the full version can be found in the real life section), when AZ-5 was engaged, the control rods would eventually do their job, but not before causing a power spike. Reactor #4 was already experiencing a runaway reaction (caused by human error), and the power spike was just enough to break the control rods, preventing them from cooling the reactor, and ultimately causing the explosion and full meltdown. Now, the Soviet authorities knew RBMK reactors had this flaw, but covered it up so as to save face. This one lie meant that nobody who worked at the plant knew that the AZ-5 would be worse than useless in their situation, and it's speculated that Dyatlov (the guy in charge at the time) pushed the reactor beyond its limits because he thought he had a failsafe, not a detonator. The mechanics behind the explosion (and the politics behind those mechanics) are discussed in episode 5, which focuses on the trial of Dyatlov, Bryukhanov, and Fomin.
  • Doctor Who: In "The End of Time", this is ultimately the cause of the Doctor's regeneration. The Nuclear Bolt cabinet is designed with a failsafe such that if it's overloaded, any excess radiation is vented inside of the cabinet, which has radiation-proof glass so it won't leak out. Unfortunately, the cabinet was also designed to only be able to operate if one of its two booths is occupied, and after being overloaded it's too heavily damaged for the Doctor to even sonic it open to rescue Wilf, who is trapped inside, without irradiating it. So he chooses to go inside and take the radiation, which is so much that it causes him to regenerate, to save Wilf's life, even though the old man protests that he doesn't have to.
  • On an episode of Eureka, a weaponized attack drone goes out of control and starts targeting the defense system it was designed to test, leading to the following exchange:
    Hit the failsafe!
    Martha isn't responding to any of my commands.
    • Subverted in the Clip Show episode "You Don't Know Jack". Jack and Allison are trapped in a Section 5 lab with a Sonic Sterilization about to go off. Allison hits the failsafe but nothing happens, prompting Jack to say, "Tell me the failsafe didn't just fail". The failsafe actually worked as intended, sending an abort signal to the PDA of another scientist, who was then supposed to deactivate the system; unfortunately, he had lost all his memories and didn't know the beeping PDA was his or what to do with it.
  • Explored in the Mayday episode on the Hinton train collision. Averted in the case of the signal lights; the investigators wonder if they might have been green instead of red due to a mechanical fault, but an electrical engineer notes that "a fault does not give a positive green light to any situation … if there was a fault … it would have forced everything to go to red." Sadly played straight with the dead man's pedal (see the Real Life section below), which is believed to have contributed to the accident.
  • In the season two finale of Mission: Impossible, the team has to retrieve a failsafe device from a B-52 that failed to self-destruct when the plane was shot down by an East Bloc nation. They have to secure the device rather than just destroy it (which would be enough to keep the Communists from reverse-engineering it and figuring out how to defeat other failsafe devices) because the original manufacturer needs to take it apart, figure out why this device failed to self-destruct, and ensure that the other devices from that production run don't suffer from a similar defect.
  • Referenced in the Mystery Science Theater 3000 episode Gamera:
    Soldier: The electrical shocks don't seem to bother Gamera at all!
    Servo: Hm, and I was counting very heavily on them...
  • MythBusters: If they're not going to ridiculous lengths to defeat the failsafes on some common household item so they can replicate a myth's results (translation: make something explode), chances are you'll find them putting out a fire or running to catch a driverless vehicle because one of the failsafes on something they made has failed.
    • Specifically, the tendency for their remote controlled cars to run wild and take out the chain link fence of their test area has become a Running Gag. The failsafe is supposed to apply brakes to the car if it loses radio contact. Some of them were genuine failures, some of them were somebody forgetting to set it. One time Jamie forgot to set it before Adam jumped in for a ride...and scratch another fence.
    • Sometimes they do consider a Myth "confirmed" if the failsafes preventing the results they wanted could reasonably be disabled by a normal person. For example, the "water heater rocket" myth required them to close off a safety valve that would otherwise prevent the thing from launching through the building and into the sky; since the valve's occasional dripping can look like a leak running up the water bill, it's plausible that an ordinary homeowner would cap it off.
    • In one instance they had to disable the safeties on an elevator. Adam eventually reported, "Anticlimactically enough, I believe I've disabled the entire mechanism by removing this simple pin." Granted, it was a very old elevator in a condemned building already slated for demolition. Wonder why...
    • One subversion: In an episode dealing with a car bumper being fired like a rocket due to the shocks failing, the guys tried everything and were defeated by at least four failsafes. They then talked to a person who had had both legs shattered when that exact failure happened for real.
  • Justified in an episode of The Outer Limits (1995); the technicians who created a planet-busting bomb are abducted and tortured by aliens to provide information on the bomb. The bomb itself is sitting in the room. The technicians find out how to bypass all of the safety measures and arm the bomb... only to find out they are on Earth, undergoing a psychological stress test. Of course, they used the real bomb.
  • This also happens in an episode of The Professionals. The components of Doyle's car are set to fail one by one as he's going down a hill. Justified though as the killer isn't trying to fake a car accident; he's just playing with Doyle before he kills him.
  • Red Dwarf:
    • In the episode "Demons and Angels", Lister says the ship is full of fail-safes; "The actual chances of it exploding are one in—" Red Dwarf explodes. "... one."
    • During the Time Skip after the series prologue, Rimmer also managed to flood the entire crew compartment of the ship with lethal radiation, because he conducted a repair without proper assistance. Ironically, the cargo decks were safely sealed during the event. Why Rimmer was even allowed to touch the ship's nuclear reactor, especially without assistance, is never clarified, but Kryten later successfully defends Rimmer in court with the defense that the person responsible for the accident was the one who gave such an obvious incompetent a job that important.
    • The novelisations have the accident unfold slightly differently, and notably it isn't Rimmer's fault: three important warning lights fail to activate, and when several more come on later, the engineering watch-stander blames them on the coffee he just spilled all over his keyboard.
  • It's incredibly common in Stargate SG-1, as well as its sister series, Stargate Atlantis and Stargate Universe.
    • The page quote comes from the season 8 episode "Avatar", in which Teal'c is trapped in a virtual reality training simulation, and the failsafe to deactivate the simulation just resets it instead.
    • The Stargate system itself has plenty of failsafes, however, as presumably it was designed to be used by other races, without such devil-may-care attitudes about personal safety. Unfortunately for the SGC, they're operating their Stargate without the standard control device, and feel free to completely disregard any warning signals from the Stargate itself. After seeing the results of ignoring one such safety feature, they comment they should probably stop ignoring them.
    • There is a safety to prevent the gate from closing when something is halfway through - on one occasion, Jack holds a gate open (to prevent the villains from dialing another location and escaping) by not withdrawing his arm from the event horizon after he arrives. The only time there is a threat of something getting cut in half is when the gate hits the 38 minute time limit, which no failsafe could possibly prevent, since except under special circumstances it's physically impossible to keep a wormhole open longer than that.
    • There is also a safety feature that ensures that an object is not transmitted through the system until it has entirely entered the gate. Otherwise when someone stepped into it, their face and feet would be sent through before the rest of them. This one fortunately never fails, which was the source of the drama in an Atlantis episode where a ship was stuck partially in a space gate.
    • Also commonly inverted in Stargate Atlantis, where Atlantis often activates failsafes during bad situations. It inevitably does exactly the wrong thing for the given situation, forcing McKay to waste precious time overriding the failsafe, or killing someone when he fails to.
    • The Pegasus Galaxy also has Spacegates, and there's no way to tell whether you dialed an address for a gate on the ground or a gate in orbit besides stepping through (or checking the Ancient database, but not every civilization has that option). The Pegasus network was designed for Lantean spaceships, but if a failsafe was ever needed in their designs, it's here. It was established at a few other points in canon that there's a whole lot of "error codes" the gate sends back to the DHD, and it's set up not to allow things like water rushing through from an underwater environment (while still allowing solids). Since the system isn't completely understood by anyone, it's possible there's an "Are you really sure you want to step through into space?" warning light.
    • Failsafes built into the Stargate network, as well as other Atlantean systems, have been mucked around and hacked so much by the Atlantis expedition (in order for Earth PCs and computers to interoperate) that they might as well not exist. Consider "Avenger 2.0" where Felger's code is none too secure, or where McKay's software updates "broke the Gate". Of course, it wasn't really broken, but nevertheless, there were glitches because of this (hence Sheppard's 40,000 year trip into the future and then some).
    • Also, the Ancients aren't big fans of failsafes. Ancient devices need to have a function as complex as time travel before someone will even consider putting in a failsafe. The failsafe that prevents Atlantis from being crushed underwater after it has run out of energy wasn't even included until someone traveled to the past from a future where the city and its occupants were flooded, for instance... Quite natural, since the Lanteans probably didn't foresee that they would submerge the city on the bottom of an ocean and then leave it unattended for 10,000 years.
    • Atlantis itself is the mother of all failsafe failures, a spaceship without an airtight interior. If the shield fails in space, everyone dies, as the crew unpleasantly discovers when they run into power problems while moving it to another planet.
    • The fact that you can remove the safeties on a ZPM is a biggie, you'd have thought that those things would have built-in safeties.
    • The Destiny's emergency atmosphere-retention forcefields are only strong enough to reduce the flow of air through gaping holes in the hull, not stop it. It's possible that they used to be stronger but are now past their best-by date. They also may not have been designed to compensate for such extensive deterioration.
    • One jammed-open airlock door is apparently enough to drain the air from all of the remaining habitable areas of the ship, with no other doors capable of being sealed to further compartmentalize the area. Granted, the ship has already had a lot of sections sealed off for this very reason, so perhaps this is simply a case of failsafes being driven past their limits by repeated failures. But one would think that having an airlock and the Stargate in the same emergency-seal compartment would be a bad idea.
    • In a case where a working failsafe caused problems, the shuttle docked at said airlock wouldn't close its own airlock door without someone inside it to operate the controls. This probably prevents you from locking yourself out, but in this case meant a Heroic Sacrifice was needed to stop the shuttle from leaking all of Destiny's air.
    • Destiny's atmosphere recyclers packed it in over the millennia and the ship automatically stopped off at a planet where needed chemicals could be found to repair it. But the ship's autopilot was still set to take the ship back to hyperspace after a fixed period of time, whether or not the chemicals had been recovered by then.
    • This trope is part of the show's stock in trade. Most of the drama is derived from either yet another of Destiny's millennia-old unmaintained systems failing catastrophically, or the failsafes working, but none of the human crew knowing what the failsafe procedure actually is, and everyone panicking. When the proper operating procedures include diving into a sun to refuel, the panic is understandable.
    • Averted in one case with an electrified corridor. There's a handy manual breaker just to make sure you can kill the power. However, the fact that a corridor can, without warning, turn into an electrified deathtrap, invokes another trope.
    • It gets better: the ship is capable of getting into your head. In "Trial and Error", when Destiny sensed Young was going through a mental breakdown, it initiated a program that caused Young to continually have vivid dreams and hallucinations of the ship being destroyed, evaluating whether he had the ability to continue commanding the vessel. Let that sink in... the ship saw someone suffering a mental breakdown and decided the best solution was to do something that nearly drove him insane.
  • This happens all the time in Star Trek, where fail-safes are almost never shown or mentioned unless they fail. One fan group came up with the name "SINEW" for this phenomenon: "Somehow it never, ever works."
    • The warp core is chronically incapable of being shut down in case of emergency. This gets to the point where, in the later series, systems are built into ships to eject the core from the ship when it's about to explode. These also always fail.
    • The classic Holodeck Malfunction requires four conditions that you'd think would be unlikely to all happen at once: the exit door, the off-switch, and the safety protocols must fail, and everything else must continue to work perfectly. There is no easy way to turn it off even from the outside, as for some inexplicable reason the holodeck is the only system on a Starfleet ship to have an independent power supply, and they can't shut it down any better than the warp core.
    • In "Elementary, Dear Data," the Chief Engineer has sufficiently high administrator privileges on the Enterprise-D's computer that he can not only accidentally lock himself out of the system, but also everybody else all the way up to the captain! Even worse, this also allowed an accidentally-created sentient hologram being generated by the computer itself to gain those same absolute administrator privileges!
    • Data is able to seize absolute control of the ship's computer by virtue of the fact that, having sophisticated speech capabilities, he can precisely mimic Picard's vocal patterns and fool the voice biometrics authentication that the computer uses. This is in spite of the fact that the computer tracks the location of all members of the crew in real time, and thus must be aware that Picard is not on the bridge even though he's supposedly issuing orders from that location. A simple retinal or facial recognition scan would have foiled or at least seriously hindered Data's efforts. It's really unsettling just how easily one android hijacks the entire Enterprise, and none of the security holes this reveals are ever mentioned or shown to be fixed. (TNG: "Brothers")
    • In the season one finale of Star Trek: Voyager, the "manual" override on a door lock is shut down by a power failure, negating the very purpose of a manual override in the first place.
    • In the episode "Unexpected" of Star Trek: Enterprise, the handrail on the lift in engineering is capable of severing limbs, as there is no cut-off if it meets resistance. The only person who views this as a problem is supposedly irrationally anxious due to pregnancy. The person he's talking to questions why anyone would put their hand on the handrail.
    • The brigs on Starfleet ships default to open in case of power failure. Even if they need the force field to prevent "jail break" via transporter, there's no reason they couldn't also have a thick steel door (not getting into the many natural minerals that have stopped transporters). There isn't even a backup power supply. The same applies to the medical quarantine in sickbay, and to the force fields that are frequently placed around hazardous life-forms and other suspicious specimens. This is not to mention the fact that force fields are also used to seal hull breaches in combat, meaning that in the event of power loss, which is likely to occur if one's shields are down, the ship is open to space.
  • Combined with The Guards Must Be Crazy in the Supergirl (2015) episode "The Darkest Place", where the Fortress of Solitude's security robot, Kelex, is fooled into thinking Hank Henshaw is Kara by Hank dumping a vial of blood onto the Fortress' console, even though Kelex is looking right at Hank and should see that he isn't her. Kelex then attacks the real Kara by assuming she was the one who accessed forbidden data, forcing Kara to destroy the robot (Kelex is rebuilt later).
  • In the Supernatural episode "Devil May Care" (S09, Ep02), Kevin is locked in the bunker unable to contact anyone after the sensors detected the falling angels.
  • Averted in Terminator: The Sarah Connor Chronicles wherein a nuclear power plant's failsafes DO kick in, but a T-888 deliberately sabotages them one by one.
  • Discussed on several episodes of Well There's Your Problem, as failsafes failing are often involved in historical engineering disasters. Examples include the bugged MCAS anti-stall system in the Boeing 737 MAX, which the hosts also note was an undocumented feature that the pilots couldn't switch off without disabling large parts of the plane's flight systems.
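Several of the examples in this folder (the dead man's pedal in the Hinton collision, the MythBusters remote-control cars) boil down to the same real-world mechanism: a watchdog that must be refreshed periodically and drives the system to a safe state when the refreshes stop. A minimal sketch of the pattern, purely illustrative and with all names hypothetical, including the "somebody forgot to arm it" failure mode:

```python
import time

class RCFailsafe:
    """Dead-man's-switch watchdog: if no radio heartbeat arrives within
    the timeout, apply the brakes. Note that the protection is only as
    good as the operator remembering to arm it."""

    def __init__(self, timeout: float = 0.5):
        self.timeout = timeout
        self.armed = False
        self.braking = False
        self.last_heartbeat = 0.0

    def arm(self) -> None:
        # An unarmed failsafe protects nothing; this step is the one
        # the MythBusters occasionally skipped.
        self.armed = True
        self.last_heartbeat = time.monotonic()

    def heartbeat(self) -> None:
        # Called every time a radio packet arrives from the controller.
        self.last_heartbeat = time.monotonic()

    def check(self) -> bool:
        """Called periodically by the control loop; returns True if braking."""
        if not self.armed:
            return False  # forgot to set it: the car runs wild
        if time.monotonic() - self.last_heartbeat > self.timeout:
            self.braking = True  # radio lost: fail to the safe state
        return self.braking
```

The key design choice is that the watchdog acts on the *absence* of a signal rather than the presence of one, so a dead transmitter or a severed link still triggers the safe state.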

    Puppet Shows 
  • In most episodes of Thunderbirds, disasters were caused, or at least not averted, by faulty safety equipment or poor engineering. Examples included bridges that collapsed as soon as their maximum load limit was exceeded, aircraft whose nuclear reactor shielding failed if the flight was delayed, failure to survey sites properly before beginning major engineering projects, and numerous vehicles without a Dead Man's Switch or equivalent.

    Tabletop Games 
  • Subverted in Paranoia, as Friend Computer is pleased to report that, before being deployed on Troubleshooter missions in Alpha Complex, any theoretically-dangerous devices, accessories, networks, REDACTED and/or systems are uniformly equipped with unbreakable failsafes which have been rigorously tested in the field by dedicated and expert Troubleshooter teams.

    Video Games 
  • The game 7 Days a Skeptic features, on an advanced space ship, an escape pod door that opens whether or not there is an escape pod that can be boarded behind it. Almost needless to say, if there's no escape pod, one is greeted by hard vacuum. There's also the fact that the escape pods need several hours to charge up, which completely ruins their intended purpose.
  • Ace Combat 5: The Unsung War: During an airshow turned firefight, Chopper gets shot up by enemy forces, damaging most of his plane's internal systems. He still flies for a minute or so to find a safe place to land, but by that time, his eject system is damaged too heavily for him to eject. He dies in the following crash.
  • Brave Fencer Musashi: Whoever designed Steamwood was NOT an engineer. Any damage at all to it results in catastrophic pressure build-up, which can only be released with eight separate valves on separate floors, each with the most convoluted method of operation seen even in a video game. It's a wonder Grillin' Village is anything more than a steaming crater.
  • One of the audio logs in Doom³ tells the story of a technician who had his arm pulled in and shredded up to the elbow by a plastic extrusion system. The machine had been properly shut down and the employee had the safety key out of the machine and in his pocket, but it turned on despite the key's failsafe and without any apparent power source. According to the log's narrator, that incident is "just one in a pile". Justified by the teleportation experiments connecting the base to Hell - it's not hard to imagine a malevolent spirit or demon wreaking havoc with the machinery and using it to hurt and kill people.
  • An Interactive Fiction game titled Fail Safe puts the player in the role of a computerized emergency help system. Someone in a crisis is calling for help on the radio, and you have to consult your database and give him instructions on what to do. There is a twist of the Tomato Surprise variety.
  • Fallout 2:
    • A thorough letter can be found describing what would happen if someone disabled the Oil Rig's reactor cooling systems: a megaton-sized meltdown (not that that's possible in real life, but then Fallout's nuclear physics have always run on the power of imagination). Guess what you need to do to proceed: hack the control system or simply blow it up. Gecko's reactor also counts, since you are ordered to shut it down; you can interpret that as fixing the coolant leak, or as running into the active zone and turning a big red valve to shut down the cooling system altogether, causing a nasty meltdown and forcing a whole city to relocate. Now then, why would anyone make the cooling system controlled by a single valve, and more importantly, why would anyone place that valve somewhere you can't reach without being roasted alive?!
    • There's a robot you can control in the Gecko reactor, so it's not all that mind-boggling. And the ghouls are immune to radiation, so again they wouldn't mind waltzing in and turning the valve.
  • Fallout 3:
    • Project Purity: when you finally mow down the Enclave defending it, Li warns you that it was damaged in the fighting; the tanks are experiencing critical overpressure and the whole thing will explode unless the purifier is turned on to relieve the pressure. Thing is, another failsafe blew out and the control room is flooded with lethal radiation, so it's going to be a one-way trip. Earlier, you can read a message noting that Vault 87's GECK room has radiation purge systems, but they can't cope with all the radiation coming in from outside and are constantly failing.
    • Or the DLC "Broken Steel": two confirmations and no password protection whatsoever are NOT enough to keep the Mobile Base Crawler from calling down an orbital strike on itself. And if you have a really determined foe on a Roaring Rampage of Revenge toting a BFG, a platoon of infantry with air support isn't foolproof either. The Enclave learned this the hard way.
  • Fallout: New Vegas has one implemented due to carelessness before the bombs dropped in the Dead Money DLC. Before the bombs fell, the Sierra Madre was set up to broadcast a distress signal on its external radio antennas if something catastrophic happened. Unfortunately, since all the radios were set to broadcast music for the Grand Opening until after it was finished (three guesses as to what happened before the Grand Opening), the 'emergency' broadcast is still a broadcast inviting people to the Grand Opening Gala Event at the casino proper. Since the broadcast was designed to repeat until help arrived, this has led hundreds of unwary explorers to their deaths over the years...
  • In Five Nights at Freddy's, the doors to the office open when the power runs out. While you would want that to happen in every other instance, here it spells doom since there are four homicidal animatronics in the building with you, and Freddy will always take advantage of this to get you.
  • Half-Life:
    • At the beginning of the first Half-Life when things go awry with the Anti-Mass Spectrometer, the classic exchange is heard:
      "Shutting down... attempting shut down... it's not... it's-it's not- it's not shutting down, it's not—" (screaming)
    • In Half-Life 2: Episode One, the Citadel's dark energy reactor had a failsafe that was deactivated by the Combine as part of a Xanatos Gambit. Reactivating the failsafe doesn't prevent the reactor from Going Critical, as its condition has already deteriorated too far, but it slows the process enough to allow the rebels to evacuate.
  • Horizon Zero Dawn:
    • A whole string of poorly-thought-out failsafes caused the destruction of human civilization. The Faro Swarm, a legion of powerful war robots invented by Faro Automated Solutions, could be deactivated if they went out-of-control by transmitting a shutdown command to them. But then a computer glitch severed the chain-of-command protocols for the Swarm, causing them to stop receiving orders... including the emergency shutdown command. CEO Ted Faro ordered the machines to be hacked and forcibly deactivated, but because of the ultra-tight encryption he insisted on giving them (another failsafe), they were basically unhackable. In addition, the Swarm was designed to be capable of both self-repair and self-replication as a failsafe against damage in the field. The end result? A massive and ever-growing horde of resource-eating, artillery-carrying deathbots waging genocidal war against the entire biosphere of Earth out of a mindless drive to replicate. Needless to say, humanity could not win and instead devoted what resources they had left to Project Zero Dawn, a process to re-terraform Earth after everything was destroyed. Which led to...
    • GAIA, the artificial intelligence that managed Zero Dawn, having her own failsafe, HADES, fail on her. If GAIA didn't get the biosphere right on the first try, HADES would take over, un-terraform everything back to zero, and give her control again, since she was too much of an All-Loving Hero to do it herself. However, she did get everything right on the first try, and HADES wasn't needed. But twenty years before the start of the game, an outside signal upgraded all of GAIA's subroutines to full sapience, and HADES immediately began implementing his extinction protocol. GAIA detonated her core to stop him, while also initializing the creation of a clone of her creator, who would be able to bypass genetic locks and reboot GAIA. Except HADES managed to escape and continue his work while also corrupting a number of files, including the Alpha Registry on the door the clone would need to open. Meanwhile, the clone was successfully created and taken in by the primitive tribals living nearby, only to be forbidden from going anywhere near the ruins she needed to enter because of their religious beliefs surrounding the remains of the old world. GAIA fell into despair as she realized her last gambit would fail... before deciding to put all her hope into the clone anyway.
  • Metal Gear Solid:
    • There's a failsafe failure that is actually part of FOXHOUND's plans. You receive a key which is part of the override, but the override wants three keys; thus, you need to heat the key up to change its shape to fit the second slot, then cool it down for the third. It's lucky that the base holding Metal Gear REX contains a foundry and is in Alaska - otherwise, how would you ever use the key? This is made even more baffling when you consider that the nuclear missile will launch with only two codes, yet three keys are needed to override it. Even worse, said override key will actually arm the weapon if it's in a disarmed state, and the key combination can only be entered once.
    • More intelligently integrated into Peace Walker, which is a "fail-deadly" nuclear device designed to guarantee a retaliatory nuclear strike in the event of an attack, creating true MAD (the idea being that humans might not be able to launch a retaliatory strike, knowing they'd wipe out an entire country). Unfortunately, the flaw is that it doesn't have a way to check and see if nukes are actually being launched, and the story revolves around someone feeding the machine fake data in order to make it strike.
  • Might and Magic VIII plays with it. From your perspective, the failsafe itself is the failure (as it will cause the destruction of the world, all for no gain). From the perspective of the ones who implemented it, the failsafe works perfectly; you just happen to be collateral damage (the reason the failsafe is there is to stop the Kreegan from subverting Escaton).
  • In a twist of irony, the fail-safe prevention in the Panzer Dragoon universe is humanity itself. Various organisms, notably Coolias, have the innate genetic potential to mutate into dragons, which are mentally overwritten by the Heresy Program to target and destroy the Towers and end the Ancients' terraforming plans. Unfortunately, dragons are seen as bad omens and abominations in general, and are summarily executed by humans when discovered.
  • In the Portal series, we find that GLaDOS had an emergency red phone in the control room so scientists could call for help if she malfunctioned. Too bad the cable was cut, preventing anyone from reporting that she was about to kill everyone with deadly neurotoxin.
  • The Resident Evil series has at least one in every single game, mainly for the purpose of locking you into whatever room holds the latest mutated/undead boss. Even the Failsafe Failure fails, as the door lock mechanism invariably releases the second you kill the Big Bad du jour. The best example of a needlessly complex failsafe comes from the train braking system in Resident Evil 0, i.e. the brakes that are supposed to keep the train from crashing into something or derailing due to an obstruction on the tracks, or simply to slow it down at its destination... the sort of thing you'd want to be able to activate immediately. To activate the brakes, you must (capitalization for emphasis):
    1: Find the brake instruction manual in the front car.
    2: Pick up the card key for the brake system.
    3: TRAVEL THROUGH THE ENTIRE TRAIN to the brake-lock system located outside of the caboose, i.e. on the completely opposite end of where the train's control systems, brake manual and brake lever are.
    4: Insert card key.
    5: (really, the only step that should be here) Pull the brake lever in the front car.
    • Slightly Hand Waved in the RE storylines, in that the designers of pretty much everything for Umbrella Corp. and Raccoon City are utterly insane - and those who weren't routinely got locked away or just plain murdered. Overly-complicated puzzles for everything became the norm in the name of "security" or because they looked cool, not to save lives or prevent catastrophes.
  • Referred to by name in The Stanley Parable. In the explosion ending, there's a button that, when pressed, shows a message on a screen reading "Failsafe Failure". No button in the room can stop the explosion and save Stanley.
  • In the Infocom game Suspended, the player wins a lottery to function as a fail-safe for the global weather control systems. Of course, when everything goes wrong, it turns out your robots are broken, the repair center is a mess, your provided documentation contains errors, and nobody actually told you how to find out what the problem is or how to fix it. And if you take too long to fix things while the casualties pile up, actual repairmen will show up to deal with the issue. They start by turning you off, as it's assumed these failures could only occur if you were causing them.
  • Subverted in the opening of Xenogears. When an alien threat is taking over the ship, the command crew attempts to cut power using the emergency blocker, a three-foot section of the power cables that jettisons itself out, creating a massive break in power and electronic communication. Unfortunately, the alien threat manages to arc the gap and continue its takeover. Their second failsafe (self-detonation) works.

    Web Animation 
  • Sarge's quote in the quotes section from Red vs. Blue highlights his tendency to build machines with "failsafes" that end up backfiring on him somehow. For example, the bomb that he built into Lopez could be armed remotely, but Sarge designed it so that he himself couldn't disarm it, just in case he was captured and brainwashed into helping the Blues. Brilliant.
  • Parodied in the Strong Bad Email "cliffhangers". Strong Bad, in his sci-fi persona of Space Captainface, orders his trusty engineer "Strap" Coopmore (the Cheat) to activate "the forward humbuckers" and prevent their ship from colliding with a comet. The Cheat points to a sign reading "The forward humbuckers have never worked."

    Webcomics 
  • In Freefall:
    • #2268 is an aversion; the failsafe works, but the user still complains.
      Stupid computer! Security should not fail safe! Security should fail dangerous!
    • Florence was inspired to become an engineer by an incident when her wheelchair-bound human companion was stranded outdoors in a cold rainstorm. The motorized chair disabled itself to prevent water damage to its electronics, not considering that water damage to the operator could be more serious.
  • In Katamari, cousin Ichigo's attempt to turn off a potential Doomsday Device is halted by a rather important design flaw.
    RoboKing: Wait patiently and watch as I earn my place in history through peerless cunning and technological prowess. Please ignore the fact that I couldn't build a fully functional on/off switch.
  • In Larp Trek, Picard's mystery hinges on the idea that, while a holographic knife would dematerialize before piercing flesh, a real knife could be used by a hologram to stab someone. Geordi thinks, and then desperately hopes, the holodeck doesn't work that way.
  • Schlock Mercenary:
    • A nuke hitting a unifield shield is dangerous, but can easily be handled by competent technicians. Unless they have a boss who thinks he knows better.
      Narrator: The House Phica primary generator was designed to take a lot of punishment. A direct hit with a nuke, even inside the gravitic shield it powers, will not of itself cause a loss of containment. For that to happen, the reactor techs would have to spend several minutes doing everything exactly wrong.
      Reactor Tech: Protocol T-21: Executing step-down and bleed.
      Reactor Boss: Drown protocol! We're under attack! Full power to the shields!
      Narrator: Exactly.
    • A spy has an Augmented Reality mask that is supposed to kill anyone else who puts it on. Not only does it fail to explode, but it doesn't log out when it's taken off the spy, allowing the woman who killed him to search through his files and discover all his secrets.
      Max: My mask should have blown up the moment that murderous little minx put it on. And it certainly shouldn't have let her snoop around in my junk.
      Murtaugh: So... you walked around for two years with a defective bomb strapped to your face?
      Max: I'm leaving that off my resume.
    • The Gavs have a protocol that says they are supposed to notify their security forces and ask for permission before starting a dangerous experiment. They decide to ignore this protocol because security has "their armored knickers in a bunch." If they had asked, they would have been informed that the reason for the panic was because they were already in a state of emergency, and as such it was a terrible time to start the experiment.
    • Cars in the setting have built-in failsafes that prevent an operator from using the manual controls while intoxicated, emotional, overtired, suffering from Testosterone Poisoning, etc. Automation is so widespread that many of them don't come stock with manual controls at all. Driving under the influence carries the death penalty, under the logic that you had to put in quite a bit of perfectly sober prep work to disable those safety features first.
    • Invoked in order to avert It Only Works Once: the NUSPI has exactly enough fuel to spin up its forward annie-plant array, fire through a wormhole at the target, and spin back down. You can fire a second shot using the fuel reserved for spinning back down, provided you don't mind the forward annie-plant array flying apart and destroying the ship.
      Ennesby: We had to disable a lot of safeties.
    • Inverted when Tagon locked Iafa out of the internal gravity because he didn't trust Iafa. This comes back to bite him when he needs Iafa to repel boarders with weaponized gravity, which Iafa can't do because of those very security measures.
      Tagon: You are now under orders to unlock it. For security reasons.
      Iafa: It's a hardware thing sir. Unlocking it will take an hour or more.
      Tagon: Get started.
      Iafa: And the hardware is inaccessible to me.
      Tagon: I hate it when we secure the wrong stuff.
  • Subverted in Skin Horse. Genetically-engineered dog Sweetheart has taken refuge from an Alaskan snowstorm inside the team's V-22 aircraft (with an A.I. known as Nick). After cranking up her courage with a bottle of schnapps, Sweetheart is ready to go out and save the mission, only to be stopped by Nick. Due to the fail-safe big orange safety lever, that doesn't stop Sweetheart for long.

    Web Original 
  • Doctor Grordbort's Contrapulatronic Dingus Directory (a mock catalogue of Steampunk rayguns and other non-existent devices of the scientific-romance era) is full of warnings about involuntary sterilization or the loss of "only some limbs" through mishandling of the devices, which is frightfully easy to do due to their needlessly complex controls and lousy human-engineering.
  • In Mystery Flesh Pit National Park, one of the park failsafes was to inject the Pit superorganism with thousands of gallons of potent muscle toxin as a paralytic/sedative agent, in case the organism ever started experiencing spasms dangerous enough to damage park infrastructure. Not only did the failsafe fail to paralyze the flesh pit, the ensuing toxic shock caused even more violent contractions that crushed most of the park, trapped hundreds of people underground (none of whom made it out alive), induced vomiting that stunk up the entire county, and ejected several of the Pit's unique gut predators onto the surface. Worst of all, this partially woke it up, at a time when the Permian Basin Superorganism was known to be perfectly capable of becoming ambulatory.

    Western Animation 
  • In one episode of Archer, a computer virus infects the mainframe and threatens to upload all the spies' names to the virus' creator. They get the idea to just unplug the mainframe until everything can be sorted out, but it turns out the mainframe has a battery backup. Behind a nearly indestructible locked door. Whose lock is controlled by the mainframe.
  • In Ben 10: Secret of the Omnitrix, the titular device gets messed with in such a way as to cause it to start a countdown to an explosion that will destroy the entire universe. The subversion is that that is the failsafe. The creator figured that destroying the universe itself was better than having the thing fall into the wrong hands. Which raises a number of questions that have never really been resolved.
  • In the Bugs Bunny cartoon Hare Lift, Bugs and Yosemite Sam are aboard a pilot-less aircraft. After an extended argument, Bugs rips out the plane's steering yoke. In response, Sam pushes a button marked "autopilot". A thin, beeping robot then emerges and upon seeing the condition of the plane's controls, immediately grabs one of two parachutes and jumps.
  • In King of the Hill, Peggy goes skydiving, but both her chute and emergency chute fail to deploy. She ends up breaking several bones.
  • Megas XLR:
    • During the first Season Finale, Coop frantically searches his dashboard for a button that will save the world, only to discover that the button actually labeled "Save the World" was marked "Out of Order".
    • A more serious example occurs when Megas breaks its protonic stabilizer, a part that seems to be the ONLY thing keeping its reactor core from immediately Going Critical with enough force to destroy a planet. A SCRAM or shutdown of the reactor is apparently impossible, as it's never mentioned as an option.
  • My Little Pony: Friendship Is Magic, season 2, episode 1 almost invokes this by name. Magical chaos is running wild. Twilight Sparkle, having seen this happen before (mainly from her own spells), has developed a failsafe spell for just this sort of occasion, and sees no reason to not use it at the first sign of major trouble. But this time, the source is stronger than she is used to, so...
    Twilight: My failsafe spell... failed.
  • Thomas & Friends adapted The Railway Series example above where Henry's train crashes due to a broken emergency signal.
  • A lot of the crises in Transformers: Rescue Bots could have been avoided if the scientists in Griffin Rock ever bothered to implant failsafes in their tech.

    Real Life 
  • A very, very narrow aversion occurred in the 1961 Goldsboro B-52 crash, when a B-52 carrying nuclear bombs broke up near Goldsboro, NC. Nuclear weapons are designed with failsafe systems to ensure that a nuclear detonation doesn't occur accidentally; in this case, there were four separate switches on the bombs that needed to be armed in order to trigger a nuclear explosion. When the military bomb disposal team found one of the bombs, they discovered that only a single switch was still in safe mode, meaning one circuit was all that stood between safety and an approximately 8-mile crater in North Carolina.
  • In 2017, WannaCry - a ransomware worm - hit more than 20% of hospitals in the UK, later spreading to over 74 countries. The malware was designed to ping an unregistered domain name before running, and to shut itself down if that domain ever resolved. A twentysomething researcher spotted this check, registered the previously unregistered domain, and the worm dutifully killed itself on every infected PC that could reach it. As the researcher, who goes by the handle MalwareTech, described it: "It thought it was in a sandbox [testing environment] and killed itself." Or, to put it another way, it was the cybersecurity equivalent of launching a missile that will only detonate if it hits a preassigned city... and having the missile technician add "Mojave Desert" to the missile's launch code database.
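The kill-switch check amounts to only a few lines of logic. A minimal sketch in Python, purely illustrative (the real worm was native Windows code, and the domain below is a placeholder, not WannaCry's actual hardcoded one):

```python
import socket

# Placeholder -- the real worm hardcoded one specific gibberish domain.
# Any name that never resolves behaves the same for this sketch.
KILL_SWITCH_DOMAIN = "unregistered-killswitch-example.invalid"

def kill_switch_tripped(domain: str) -> bool:
    """Return True if the domain resolves, i.e. someone has registered it."""
    try:
        socket.gethostbyname(domain)
        return True   # resolves: assume a sandbox is faking DNS; stand down
    except OSError:
        return False  # unregistered: nobody is watching; keep running

# The worm ran this check before doing anything else and exited when it
# tripped -- which is why registering the domain halted infections worldwide.
if kill_switch_tripped(KILL_SWITCH_DOMAIN):
    raise SystemExit("kill switch tripped; exiting")
```

The irony the entry describes falls straight out of the code: a check meant to hide the malware from analysts doubled as a single global off switch anyone could flip by registering one domain.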
  • As recounted in the Seconds From Disaster documentary, the Gare de Lyon rail accident in 1988, where an inbound SNCF commuter train with disabled brakes crashed into a stationary outbound train, was the result of a horrible chain of errors and system failures that completely overwhelmed the existing failsafe mechanisms. Almost everything that possibly could have gone wrong, did.
  • The Moorgate Tube crash of 1975 highlighted a major flaw in the London Underground's safety precautions, which were designed to withstand an inexperienced or careless driver engaging the brakes late but not to handle a train running towards the end of the line at full throttle. And for reasons never definitively established, the driver was still gripping the Dead Man's Switch right up until the train hit the end of the tunnel.
  • The Therac-25 radiation therapy machine is now used as an example in engineering textbooks of how not to design a safety-critical system. Through a combination of corporate negligence and incompetent design, it killed or maimed several patients with overdoses of radiation. The machine contained two radiation sources: one with low power for direct use, the other 100 times more powerful to be used only with diffusing hardware. A software module was intended to prevent human error from activating the high-power beam without all its accompanying hardware engaged... but pressing a key at just the right instant would crash that module and the operator would have no idea. Oooops. The Therac-25 disaster is used to demonstrate several basic design principles:
    • Do not reuse existing software after hardware changes.
    • Provide human operators with clear and significant error messages.
    • Do not rely exclusively on software to verify hardware status.
    • Have a clear reporting system for errors and accidents at the corporate and governmental levels.
    • Do not assume your software is flawless.
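The fatal software flaw boils down to a check-then-act race: the machine verified safety conditions once, then fired based on that stale result while the operator's rapid edits changed the underlying state. A heavily simplified, hypothetical Python sketch of that shape (the real Therac-25 ran PDP-11 assembly; the class, mode names, and flags here are illustrative only):

```python
class TheracSketch:
    """Toy model of a check-then-act race, NOT the actual Therac-25 code."""

    def __init__(self):
        self.mode = "xray"             # high-power mode needs the spreader
        self.spreader_in_place = True  # hardware that diffuses the beam

    def operator_edit(self, new_mode: str) -> None:
        # A fast prescription edit changes machine state AFTER the safety
        # check below may already have passed.
        self.mode = new_mode
        self.spreader_in_place = (new_mode == "xray")

    def fire(self, checked_mode: str) -> str:
        # BUG: acts on a previously captured check result instead of
        # re-verifying hardware state at the moment of firing.
        if checked_mode == "xray" and not self.spreader_in_place:
            return "OVERDOSE"          # high-power beam, no diffuser
        return "ok"

machine = TheracSketch()
checked = machine.mode                 # safety check happens here...
machine.operator_edit("electron")      # ...operator edits in between...
result = machine.fire(checked)         # ...and the action trusts the stale check
```

The listed design principles all attack this pattern: re-verify the actual hardware state at the instant of firing, rather than trusting a cached software check.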
  • Darwin Awards winners/losers often go to extreme lengths to override major failsafes in order to achieve minor objectives. Like this man, who had to try really hard before he could get run over. Or the winner who tried to unjam a woodchipper without turning it off first and Fargo-ified himself.
  • The sinking of the Titanic on 15 April 1912 was so unexpected because of its novel failsafe design, with multiple watertight compartments that should have kept it afloat even with several compartments breached - the ship was designed to survive any two compartments flooding, or the first four, but the iceberg opened at least five. The bulkheads that were supposed to seal off the compartments, while extending above the waterline, were not sealed at the top, meaning they could still overflow and fill other compartments, something of a design flaw. Once the front compartments flooded, the extra weight pulled the bow down and accelerated the sinking, causing the stern to rise into the air and eventually leading to the ship breaking in half from the strain. Having all the water in the front also hindered the pumps meant to remove floodwater, which were at the rear. It's now believed that if there had been no compartments and the ship had flooded evenly, it would have sunk much more slowly, allowing help to arrive in time. And of course, the final failsafe on any ship - the lifeboats - failed to save most of the passengers, because there weren't enough of them. The Titanic had room to carry enough lifeboats, but sailed with only a third of that capacity. Contrary to urban legend, this wasn't because the builders thought the ship was unsinkable and dodged regulations; carrying only a fraction of lifeboat capacity was standard practice at the time, based on assumptions about how slowly passenger ships sank - it was expected that help would arrive before the ship had to be completely evacuated, with the lifeboats simply ferrying passengers to the rescue vessels. The only ships that sailed with enough lifeboats for everybody were warships, which were expected to go down in conditions where they would sink fast. 
There were regulations on the bare-minimum number of lifeboats to be carried, but they were based on the weight of the ship, not passenger capacity, and the number aboard the Titanic as built was already over that limit. The Titanic just had the poor luck to sink in a situation where the closest ship (the Californian) couldn't receive her distress calls due to the lack of round-the-clock wireless operation, and the second closest (the Carpathia) was too far away to reach the Titanic before it sank.
  • The Chernobyl Nuclear Power Plant had all the normal failsafes for its reactor design, but operators had deliberately disabled many of them to test a new shutdown procedure.
    • Worse, the operators were trying to increase output because the test wasn't working as anticipated. Even worse than that, the shutdown had been scheduled much earlier in the day but minor problems on the grid delayed it. Thus, the crew performing the test were not the ones who had been briefed on it. They mistakenly reduced power, then over-corrected into a completely untested and unstable reactor mode... then decided to proceed with the test instead of just shutting down.
    • The operator errors were exacerbated by several design features that were clearly less than ideal. For example, the control rods were poorly designed - in an emergency reactor shutdown or "scram", the control rods are dropped en masse to block neutron emissions and shut down the chain reaction. Because Chernobyl's control rods had graphite tips (the same material used to moderate the reaction in the first place), the scram caused a sudden power flare before damping the reaction. The Soviet Union was aware of the potential for power spikes, but previous spikes had always been brought under control and the problem never became a priority to fix. By the time the (unfortunately manually operated) scram button was finally pressed at Chernobyl, a partial meltdown had already begun. When all of the graphite rod tips entered the chamber at once, the resulting power spike damaged the reactor vessel. The control rods broke off, leaving the reaction-boosting graphite tips lodged in the chamber and the actual control rods jammed and broken, at which point the reactor exploded. However, many other RBMK-design reactors were successfully operated, after modification, for many years after the catastrophe, suggesting that the design issues were not as big a factor in the disaster as human error.
  • Britain's experimental Windscale nuclear reactor was simultaneously an example and an aversion of a Failsafe Failure. The reactor was constructed to give Britain parity with the United States in the nuclear arms race, but a combination of modifications to the reactor's operating procedures and an incomplete understanding of graphite's response to neutron bombardment meant the reactor would periodically give off spikes of high heat, for which the original temperature monitoring equipment was woefully inadequate. This also lulled the operators into getting accustomed to seeing occasional high temperatures in the normal course of operation. Thus, when the reactor caught fire, it was at first thought nothing was wrong. The fire burned for three days at temperatures exceeding a thousand degrees centigrade whilst the temperature sensors - located away from the hotspots - reported normal operating conditions. The original design allowed for the uranium fuel cores to be pushed through their channels into a cooling bath, but by the time the fire was discovered the cores were too hot to move: not only had they been jammed in place by heat expansion, they were so hot that the metal poles used to try to move them simply melted on contact. After several failed attempts to cool the reactor, it was eventually brought under control by flooding the cores with water. Even so, the disaster could have had far more serious consequences. Windscale was air-cooled: core temperature was kept under control by a series of fans, and the waste heat was exhausted into the air. On the suggestion of Nobel Prize-winning nuclear pioneer Sir John Cockcroft, the cooling towers were fitted with expensive, complex air filters, which were originally pooh-poohed on account of the work involved - the towers had already been constructed by the time Cockcroft found out about them, and the filters were large, heavy structures that had to be built on top of the towers. 
As it turned out, the filters prevented the direct release of red-hot nuclear particulates into the environment, although the release of radiation was nonetheless substantial.
  • On the Boeing 747's first flight, the backup batteries that would have powered the hydraulics in case of engine failure failed on takeoff. That doesn't sound like much until you learn that the newly introduced high-bypass engines were very finicky, and the engineers had little idea whether they would stall on takeoff due to the change in angle of attack (these engines stalled very easily - at the time, a tailwind could easily lead to a stall). Engine stall = no power = no hydraulics = no control over flight surfaces = guaranteed crash of a 700,000 lb bomb loaded with jet fuel. Thus, they strapped on some batteries to power said hydraulics... and the batteries failed. Fortunately, the engines behaved and the first flight went according to plan. Source: Wide-Body: The Triumph of the 747 by Clive Irving.
  • The tale of the Gimli Glider. On July 23, 1983, a combination of a faulty fuel quantity sensor and a pounds-versus-kilograms conversion error left Air Canada's brand-new Boeing 767 underfueled, so its pilots didn't know they were low on fuel until they ran out - at 41,000 feet, in midair. To top it off, most of the instruments in the cockpit were electronic and powered by generators driven by the now-dead engines, leaving the pilots with only a handful of working instruments and limited control over the aircraft. Fortunately for all concerned, there were a few battery-powered backup systems and a nearby decommissioned landing strip (the former Royal Canadian Air Force Station Gimli), and Captain Pearson happened to be an experienced glider pilot. The Gimli Glider managed to land safely (though it blew out a few tires on the landing gear and skidded to a stop on its nose) with no fatalities and only minor injuries. The aircraft was repaired and flew for 25 more years until it was retired in 2008. One failsafe didn't fail: the 767's Ram Air Turbine, basically a small windmill emergency generator, deployed automatically and provided minimal power for the flight controls and instruments until the plane finally lost too much airspeed just before landing. That said, there were a few issues with the evacuation slides, which would be a minor example of this trope in its own right (the collapsed nose gear tilted the whole plane at an angle such that the rear evacuation slides didn't quite reach the ground, leading to some of the aforementioned minor injuries).
  • This was an issue that arose with airliner hydraulic systems in the 1980s. Airliners had three separate hydraulic systems, specifically so that if one of them was breached, the other two would prevent a total loss of flight controls. However, this scheme relied on at least one system staying intact; designers did not believe it was possible for all three to fail (at least not without a level of catastrophic damage to the aircraft that would render the issue moot anyway), so this appeared to be acceptable. That is, until two crashes, in 1985 and 1989, exposed a weakness: because all three lines ran close together through the tail, significant damage in that area could, in fact, breach all three hydraulic lines at once. Two potentially manageable failures (a ruptured aft pressure bulkhead and an exploded engine, respectively) instead ended in catastrophic accidents with a combined death toll of 632 (and it's only thanks to some extremely skillful flying in the latter case that that number isn't even higher). Following the second accident, DC-10s were fitted with hydraulic fuses to guard against what had previously been considered an impossible failure: if a hydraulic line is breached, the fuse seals the line upstream of the breach to prevent a total loss of hydraulic fluid.
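The fix described above works much like an electrical fuse: a hydraulic fuse senses abnormally high flow out of a line (the signature of a breach downstream) and closes the line off so the rest of the system keeps its fluid. A minimal sketch of that logic - the class, names, and thresholds are illustrative, not taken from any real flight-control system:

```python
# Illustrative model of a hydraulic fuse: if flow through a line exceeds
# a threshold (indicating a breach downstream), seal the line so the
# remaining fluid survives. All numbers here are made up.
class HydraulicFuse:
    def __init__(self, max_flow_lpm):
        self.max_flow_lpm = max_flow_lpm  # flow above this implies a breach
        self.sealed = False

    def check(self, flow_lpm):
        """Seal the line on abnormal flow; return True if fluid still passes."""
        if flow_lpm > self.max_flow_lpm:
            self.sealed = True
        return not self.sealed

fuse = HydraulicFuse(max_flow_lpm=40)
print(fuse.check(25))   # normal demand: fluid flows
print(fuse.check(90))   # breach-level flow: the fuse seals the line
print(fuse.check(25))   # stays sealed even after the flow drops
```

The key design point is that the fuse latches: once tripped, it stays closed rather than reopening when the measured flow falls back into the normal range.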
  • On Aug. 18, 2003, Hitoshi Nikaidoh, a surgical resident at Christus St. Joseph Hospital in Houston, Texas, was decapitated by an elevator with faulty door failsafes. The car was supposed to be "out of order", but some jerk removed the sign. Did anybody think to cut the elevator's power?
  • Pressurized or liquefied-gas cylinders are nasty things if not treated nicely. There are very good reasons why cylinders have pressure relief valves and rupture disks, and why those devices should never be removed or disabled. As MythBusters demonstrated, a ruptured gas cylinder can punch a nice, clean hole through a cinder block wall (they built one for the test). In the process, the wall was shoved back noticeably, and the wall behind said cinder block wall was nearly punched through as well.
  • In the design of the space shuttle Challenger, the joints between booster rocket segments were sealed by two thin rubber O-rings, the second ring intended as a failsafe if the first failed to seal. The SRB design itself wasn't flawed; it worked fine - so long as you maintained mission parameters, didn't try to launch on a day below the recommended operating temperature, and didn't reuse parts that were obviously deteriorating. The Morton Thiokol engineers who designed the SRB knew this and objected when NASA and their own upper management overrode their recommendations. Thanks to near-freezing temperatures at the January 28, 1986 launch, both rings failed to seal and were vaporized. Then there was nothing to stop the rocket's flame from burning away one of the booster's support brackets, after which the unsecured booster smashed nose-first into the external fuel tank, which ruptured under the impact and the aerodynamic forces. Also, pressure suits and cockpit ejection seats had been discarded after the first few missions, since a crew of seven couldn't all be ejected during launch: in the two-deck orbiter design only the pilot and commander, and possibly the two crew seated behind them, could have ejected, while the passengers a deck down (in Challenger's case, including teacher-in-space Christa McAuliffe) would have died anyway. By contrast, the earlier Mercury and Apollo launchers were both equipped with escape towers - separate rocket systems that pull the crew capsule off of and away from a faulty booster rocket - while the side-by-side Gemini design allowed for ejection. And there have been two different cosmonaut crews saved from certain death by escape towers launching their Soyuz capsules away from an exploding rocket.
    • The O-rings weren't even designed to act as failsafes against the joints flexing. It was discovered that the stress of ignition tended to bend the joints between two segments of the booster, but the O-rings, heated by the igniting rocket fuel inside, would expand enough to cover the gap. As this seemed to work just fine, the design wasn't changed. However, since the launch site was in Florida, no one thought to take into account extreme cold's effect on the rubber O-rings. In fact, the Morton Thiokol engineers specifically told NASA on the morning of the launch that they had no idea how the cold would affect the O-rings (the launch had already been delayed several times, and NASA pressured them into giving the okay). So, in a cascading series of improbable events, the joint bent under the stress of ignition while the O-rings, which had frozen overnight, failed to heat and expand fast enough to seal the gap and were instead vaporized.
    • The Soviet Buran shuttle was launched some two years after the Challenger catastrophe, but included the ejection seats for the whole crew right from the start of the design process. In fact Soviet designers have long (and quite vocally) criticized the Space Shuttle cockpit layout, calling it a throwback to the WWII era bomber cockpits, and made a point of putting all crew seats on the same deck. Had the Shuttle used the same layout, at least several crewmembers could've possibly been saved.
  • The crew of Soyuz 11 died from a leaking pressure valve during reentry. During descent, the explosive bolts attaching the service module to the descent module fired simultaneously instead of sequentially. This damaged the pressure equalization valve that was supposed to open once the module entered Earth's atmosphere, causing it to open while the module was still in space. There was a manual override for the valve, but it was located underneath the seats, making it almost impossible to reach in an emergency. A cosmonaut on the ground later attempted to close the valve himself and found that it took over a minute, while the biometric sensors on one of the crew showed only 40 seconds between loss of pressure and death; in reality, oxygen deprivation would have incapacitated the cosmonauts even faster, in approximately 15-20 seconds.
  • Apollo 1 and Liberty Bell 7. The latter case came first.
    • After Virgil Grissom's Mercury capsule splashed down, the explosive bolts on the hatch (which were there for emergency egress purposes) went off prematurely, letting water rush into the tiny capsule and sink it. Grissom was very much in danger of drowning: the recovery helicopter crews were so focused on salvaging the spacecraft that they only gave it up as a lost cause - and noticed the astronaut struggling to stay afloat - just in time to rescue him. After losing a spacecraft this way, NASA decided explosive bolts were a bad idea and did not incorporate them into subsequent designs, instead opting for doors that required much more deliberate effort to open. On the Apollo 1 spacecraft, the door opened inward, which, ironically, was a safety measure. While the crew of Apollo 1 was running a test on the ground, a fire started in the capsule. The atmosphere inside was pure oxygen at greater than sea-level pressure (in case you flunked chemistry, pure oxygen feeds any flame), and NASA had also managed to put all kinds of flammable materials in the cockpit. As the fire rapidly grew, so did the pressure inside the cockpit, making it impossible for any human being to open the inward-opening door, which, due to the lack of explosive bolts, could not be blown open either. Smoke and fire turned the cockpit into a fiery tomb from which there was no escape, and all three astronauts died. One of them was Virgil Grissom.
    • The fact that the cabin was pressurized with pure oxygen to slightly above sea-level pressure (about 16.7 psi, to keep a net outward pressure on the hull, as the capsule would experience in space) was a major contributing factor. It meant there was far more oxygen available to accelerate the fire than in ordinary air, and that the inward-opening door was simply physically impossible to open until the internal pressure dropped. Just as there were no explosive bolts, there was also no way to depressurize in time. A means for rapid depressurization did exist, called the cabin repress valve; there was just no time to use it. According to the Apollo 1 fire timeline, the crew reported the fire at 23:31:04.7, and the command module ruptured from overpressure at 23:31:19.4, less than 15 seconds later. During the investigation, it was determined that opening the repress valve would have delayed the rupture by about one second. Even if the atmosphere could have been flushed instantly, the interior surfaces carried foam padding, there to protect bulkheads and side panels from being scuffed and dinged during ground testing and due to be removed before an actual launch. After soaking in pure oxygen for three-plus hours, that foam would have burned like napalm even in hard vacuum.
  • A similar story happened in the Soviet program too, though it wasn't really a case of failsafe failure - only crew error. A cosmonaut on a week-long test in a chamber with an oxygen-rich atmosphere decided to brew himself some tea and turned on a hot plate. As he was scheduled for medical tests that day, he needed to clean and disinfect the electrode attachment points on his skin, which he did with an alcohol-soaked cotton swab - which he then unthinkingly tossed in the general direction of the garbage bin. Unfortunately, the swab landed right on the hot plate and, in the chamber's oxygen-rich atmosphere, combusted immediately, starting a major fire. Because of the pressure difference across the chamber door, it could only be opened some 20 minutes later, by which point the cosmonaut in question, Valentin Bondarenko, had already sustained third-degree burns, from which he died a couple of days later.
  • The Apollo 13 Failsafe Failure was even more spectacular (the fact that NASA managed to bring the command module home with all three men alive is often considered NASA's Moment of Awesome). It was essentially a whole series of Failsafe Failures.
    • The faulty oxygen tank on Apollo 13 had previously been installed on Apollo 10, but was removed and sent back to the factory because of a design change. It was jarred during removal: someone forgot to remove a screw that was holding it in place, so the machine arm lifted it a few inches and dropped it. This knocked the drainage tube in the tank (which was also part of the electronic level gauge) out of alignment, preventing the gauge from properly indicating when the tank was empty. The tank contained an electrical heating coil that could be turned on to warm the oxygen for use in flight, and a pair of fans used to stir the contents to get an accurate level reading in ballistic (zero-g) flight. During a test run, because the gauge wasn't working properly, technicians believed the tank wasn't draining, so they turned the heater on. The heater had two thermostats that should have opened and broken the electrical circuit if it got too hot - but the tank circuits were designed for the 28 volts supplied by the onboard fuel cells in flight, while the ground support equipment powering this test ran at 65 volts. The thermostats were rated for 30 volts maximum, and the excessive voltage welded their contacts shut. There was also a human watching an external gauge that registered the temperature inside the tank, but it was only designed to read up to 80 degrees Fahrenheit (still hundreds of degrees warmer than the cryogenic oxygen the tank normally held). Since the needle never went above 80, he didn't realize the inside of the tank was approaching 1,000 degrees, and as a result the insulation coating the electrical wires inside the tank melted... which left them vulnerable to short circuits and sparking.
Apollo 13 was launched on schedule with the faulty tank, and four days into the flight, they flipped on the stirring fans in the tank, the damaged wiring sparked and ignited the tattered remains of insulation, and fire inside the tank promptly exploded it.
    • To make matters even worse, the Apollo craft carried two oxygen tanks for extra safety, but they shared some plumbing, and when tank #2 exploded, it took several critical parts of tank #1's plumbing with it. Result: the oxygen in both tanks was soon lost, and the astronauts inside would have died in a few hours if they hadn't been carrying a healthy lunar module with its own oxygen supply.
    • At least the nuclear fuel rod cask was fine. A miniature nuclear pile was built to power some instruments that were to be left permanently on the moon, but just in case the mission never got to the moon, a ceramic cask was built to contain the nuclear fuel, and it was designed to survive a fiery reentry to Earth... just in case. And survive it did.
      • The SNAP-27 carried on Apollo 13 (and Apollos 12, 14, 15, 16, and 17) was a Radioisotope Thermoelectric Generator - essentially an atomic battery, not a reactor (there's no fission chain reaction going on; that's the distinction). It turns the heat of spontaneous radioactive decay into electricity via thermocouples, and is similar in design to the RTGs carried on the Pioneer, Voyager, Cassini, and New Horizons probes. RTGs have also been used to power lighthouses, Antarctic science experiments, and anywhere else a decade or more of reliable, maintenance-free power is needed. It was probably one of the most reliable components flying on Apollo 13. Even if the cask had ruptured during reentry, the plutonium fuel capsule inside would have remained intact and sunk to the bottom of the same deep ocean trench where the cask now rests, doing absolutely no harm to anyone for thousands of years.
  • The Ariane 5 rocket's maiden flight: the navigation system had two redundant computers to handle hardware failure, but both ran identical software, so when a software error occurred instead, both computers crashed simultaneously. The consequences threatened to break the rocket apart, but were successfully caught by another failsafe... the Range Safety system.
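The Ariane 501 lesson is that hardware redundancy buys nothing against a software fault, because identical software fails identically on the same input; in the actual accident, an unprotected conversion of a 64-bit floating-point velocity value to a 16-bit signed integer overflowed on both the primary and backup inertial reference computers. A hedged sketch of the principle - the channel structure and numbers are illustrative, not the real flight code:

```python
# Two "redundant" channels running the same conversion routine: the same
# out-of-range input crashes both, so redundancy does not help.
def convert_to_int16(value):
    # Mimics an unprotected float -> 16-bit signed integer conversion.
    if not -32768 <= value <= 32767:
        raise OverflowError("value out of int16 range")
    return int(value)

def run_channels(horizontal_velocity):
    results = []
    for channel in ("primary", "backup"):
        try:
            results.append((channel, convert_to_int16(horizontal_velocity)))
        except OverflowError:
            results.append((channel, "FAILED"))
    return results

print(run_channels(1200.0))    # in range: both channels agree and succeed
print(run_channels(64000.0))   # out of range: both channels fail at once
```

This is why safety-critical systems sometimes use dissimilar redundancy - independently written software on each channel - rather than simply duplicating the hardware.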
  • As shown on MythBusters: plugged safety valve on water heater + thermostat failure = steam-powered ballistic missile. As reported all over the news, one such incident occurred in a strip mall in Burien, WA on July 28, 2001.
    • Part of the start-up procedure for an industrial or commercial boiler (a water heater is technically considered a type of boiler) involves raising the pressure (steam) or temperature (water) high enough to make the safety valves lift; if they don't, you shut the boiler down. The springs holding the valves closed are set so they lift automatically at the maximum working pressure or temperature. Broken gauges and sensors can still cause an explosion during startup if the valves (which require no power) are broken, because the boiler won't shut down automatically and the person running the startup won't know there is a problem until it is too late.
  • The original DC-10 airliner cargo door fault that caused the crash of Turkish Airlines Flight 981 in 1974. Firstly, the cargo door opened outwards, as opposed to inwards, in order to maximize cargo capacity. This meant that air pressure inside the plane would naturally try to force it open, requiring a complex set of locking hinges and pins to keep it closed. Secondly, the door handle was supposed to be impossible to close unless all the pins were safely latched, but in practice, if enough force was applied to the handle, the internal mechanisms would bend out of shape without latching. So the door could still appear closed and locked even when it wasn't. Thirdly, warning placards to inform the ground crew of the potential problem were installed, but they were only in English, which many ground crews around the world couldn't read. And finally, when the door blew out, the pressure change collapsed the cabin floor and severed all of the aircraft's control lines, including the redundant backups, rendering the pilots helpless. Airliners have floor vents to prevent exactly such a catastrophic failure, and the DC-10 DID have them - they were simply too small, and the pressure between the cabin and the cargo compartment didn't equalize fast enough to keep the floor from collapsing.
    • And the reason there were even warning placards in the first place? In 1972, an American Airlines flight had the exact same problem, but that one got a small yet significant lucky break: not all of the control lines were severed, so the pilots retained some level of control and managed to land the plane with no loss of life. Because the only way the FAA could force McDonnell Douglas to fix the planes was to ground them all until the doors were redesigned, there was instead a gentlemen's agreement between the head of the FAA and McDonnell Douglas to put in this "failsafe" rather than fix the fundamental issue.
  • Modern Formula One cars have anti-stall systems in the engine management computer. These are very useful as long as they don't go off accidentally on the starting grid and put the car into neutral when it ideally should be in first. This is more embarrassing than dangerous though.
    • The system is also capable of forcing the car to keep moving when the driver attempts to stop. This caused test driver María de Villota to crash into a stationary truck, suffering serious head injuries that may have contributed to her death 15 months later.
  • HMS Ark Royal sank after being hit by a torpedo that, among other things, caused flooding that shut down the boiler which powered the emergency pumps and all the electrical generators, the ship having been built without dedicated emergency generators separate from the main system. Oops. (Poorly engineered and inadequate electrical systems were a "feature" of all Royal Navy ships of the period because their urgent need to rearm arose right when the Great Depression was making R&D funds hard to come by.)
    • USS Enterprise (the fifth one, CV-6) had a steering engine breakdown in the middle of one of the carrier battles for Guadalcanal, jamming the rudder into a hard turn. Fortunately the crew had rigged an emergency steering engine in case this very thing happened. Unfortunately, everyone in the compartment with the backup was knocked out by toxic gas released from nearby fires. It took nearly thirty minutes for someone to reach the compartment, and before he could turn on the backup motor he passed out as well; fortunately he came to fifteen minutes later and managed to turn on the backup motor. While all this was going on, another Japanese air raid was detected approaching but turned away fifty miles out.
    • The Japanese carriers at Midway had the emergency generators for their firefighting systems just off the upper hangar deck, at about the midships line. This placed them on the same deck where any bomb that struck the carrier would probably explode, at about the spot enemy pilots would use as an aiming point. At least two of the carriers probably lost their backup generators to shrapnel from exploding bombs.
      • This was a problem that most Japanese ships had during WWII, particularly their carriers. They had very intricate and sophisticated damage control systems, but those systems were also very vulnerable: a single hit could knock out the entire network. This was in contrast to USN ships, which had multiple redundancies - if some part of the ship went "Boom!", there were usually two or three other stations around the ship from which the systems could still be operated.
    • Similarly, the British Navy during World War I had one of the safest and most efficient systems for moving propellant charges from the magazines up to the turrets. Unfortunately, Admiral David Beatty of the Battle Cruiser Fleet thought its safety precautions slowed the rate of fire too much, and so decided to bypass them, unlike his superior, Grand Fleet commander Admiral Jellicoe. The result? At the Battle of Jutland, both the battlecruisers of the Battle Cruiser Fleet and the dreadnoughts of the Grand Fleet sustained similar hits. But while the dreadnoughts stood up beautifully, firing back and damaging several of their German counterparts so badly that they were effectively forced out of the war, three battlecruisers exploded in as many minutes.
  • Inversion: electrical codes require overcurrent protection (fuses or circuit breakers, for example) on all circuits, to stop the current before the wiring gets hot enough to possibly catch fire. Aspiring electricians have the rules for preventing electrical fires hammered into their heads repeatedly (electrical fires being as much of a danger as electrical shock, if not more). So it is jarring, at first, to learn that the circuit feeding a fire pump must never be allowed to trip on overload - any protective devices on it are sized to carry even the pump's locked-rotor current indefinitely. Why? If the fire pump is running, it is assumed there is already a fire, and a fuse or breaker shutting off the pump isn't going to improve the situation; the pump is allowed to run itself to destruction.
    • A related quirk of American wiring practice: the neutral and the grounding conductor are bonded together only at the service panel, and are kept as separate wires everywhere downstream of it, so every outlet carries both a neutral contact and a ground contact.
      • One result of this is that some American three-phase electrical outlets have up to five electrical contacts: one for each of the three alternating-current phases, one for the neutral wire, and one for the ground wire.
    • The grounding wire is a failsafe, usually connected to the metal chassis of an appliance, so that if a short circuit occurs the fault current dumps to ground rather than passing through your hand or anything else touching the case. However, you can "disable" it with a three-prong to two-prong adapter (ironically, the adapter has a tab intended to be fastened under the screw of a grounded outlet box to maintain the ground connection... but hardly anybody ever does).
    • Zinsco brand circuit breakers, installed in countless homes from the 1950s to the 1980s, were infamous for spontaneously arc-welding themselves into the "on" position, leading to thermal runaway and structural fire in the event of an overload or short. Federal Pacific Electric breakers were similarly non-UL-compliant and could randomly fail to trip after years of seemingly smooth operation.
  • One of the main causes of the Three Mile Island nuclear accident was a pressure relief valve sticking open. At first the dangerous pressure is relieved — and then the coolant keeps escaping through the stuck-open valve.
    • Which the operators would've noticed, if the indicator light had been connected to the valve itself rather than the switch that controlled the valve.
    • Adding to the problems there, the plant was being operated with several alarm lights permanently locked on due to some type of failure in the system causing the alarms to read false, and instead of fixing the issue, they just ignored them. So when the things which those alarms were supposed to be monitoring actually reached the alarm point... no-one knew.
    • Ironically, the manager on duty at the time of the accident had gone to see a movie the night before... The China Syndrome, which is about safety coverups at a nuclear power plant, complete with a near-meltdown situation.
    • Another problem was the control room was equipped with more than 120 separate dials, alarms, and gauges, making it very difficult to isolate the root cause when the accident triggered virtually all of them at once.
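The indicator-light mistake at Three Mile Island is an instance of a general pattern: instrumenting the command sent to a device rather than the device's actual state. A minimal sketch of the difference - the class and names are illustrative, not from any real control system:

```python
# A relief valve that can stick open, plus the two things an indicator
# light could be wired to: the command signal or the real valve position.
class ReliefValve:
    def __init__(self):
        self.commanded_open = False
        self.actually_open = False
        self.stuck = False

    def command(self, want_open):
        self.commanded_open = want_open
        if not self.stuck:            # a stuck valve ignores the command
            self.actually_open = want_open

valve = ReliefValve()
valve.command(True)      # pressure spike: open the relief valve
valve.stuck = True       # the valve sticks in the open position
valve.command(False)     # operators command it closed

print(valve.commanded_open)  # False - a TMI-style light reads "closed"
print(valve.actually_open)   # True  - coolant is still escaping
```

Wiring the light to `commanded_open` is cheap and usually agrees with reality, which is exactly why the discrepancy goes unnoticed until the one time it matters; a position sensor on the valve itself reports `actually_open` instead.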
  • The SL-1 reactor, site of the only fatal reactor accident in US history - it killed all three people on site. It was built for and run by the US Army as a prototype for a small, semi-portable reactor to power remote military installations. A technician was performing a maintenance test on it while it was shut down. Said test required him to manually lift the reactor's single central control rod a few inches; he raised it almost two feet. The reactor instantly went prompt critical, the sudden power spike caused the water in the core to superheat and flash to steam, and the pressure surge ejected the rod assembly, impaling the technician on the roof of the compartment. The remaining failsafes - the ones that hadn't been violated or ignored - did kick in and shut down the reactor, but not before the other two people at the site were killed by the explosion. All three had absorbed enough radiation that they had to be buried in lead-lined coffins entombed in cement.
  • The Deepwater Horizon oil spill in the Gulf of Mexico occurred because the blowout preventer, a supposedly idiot-proof device that seals the pipe in the event of something like, say, a rig explosion, failed. It turns out that the device had been tampered with (one of the rams that would have sealed the pipe was taken out to make room for some kind of monitoring equipment, amongst other things) but it's still a great example.
    • It gets better. That was the backup device. Someone noticed a problem with the primary during some tests, hence Transocean fitted the monitoring kit and said "Oh, don't worry, the backup will take care of it." Predictably, when it was called upon, the backup failed.
      • It still gets even better - there has been a rumor that the alarm was turned off so false alarms wouldn't wake people up. No wonder 11 people died.
    • BP's safety record is one of the worst in that regard, in that disabling failsafes and monitors to increase productivity seems to be SOP for the company. For example, the earlier Texas City Refinery explosion occurred partly because an overflow alarm had been disabled; when the other one broke, the resulting chain of events killed 15 people.
      • Transocean's record is arguably even worse: the company actually cited the year of the spill as its best year ever for safety, with its lowest number of accidents.
  • The Big Bayou Canot train wreck of 1993 happened because a barge struck a railroad bridge hard enough to distort and displace the tracks, but not hard enough to actually break them, which would have set off warning signals and stopped the train.
  • Cancer is an example of a failsafe failure of a failsafe failure of a failsafe failure of a failsafe failure. Precancerous cells are a normal occurrence in the human body due to imperfections in DNA replication. Fortunately, human cells have proliferation-control mechanisms, though failures in these systems can cause inappropriate cell division. On top of this, cells have further layers of protection: contact-inhibition signals that stop cell division, dependence on cell-survival signals, immune detection of cancerous cells, and telomeres that limit the number of times a cell line can replicate. These remaining failsafes can all be overcome, though, by natural selection and the law of large numbers acting on the mutations that keep accumulating in subsequent generations of the cell line.
  • Alzheimer's, meanwhile, is to some extent a problem because the failsafe works too well: the brain is equipped with so many tiers of redundancies and backups that it can suffer a huge amount of neural degradation before the person's everyday performance is noticeably affected by it - but this means that by the time the symptoms are obvious, it's because everything that the brain can do to try preventing Alzheimer's from happening already has been done and the battle is mostly lost. In practice this is less of an issue because, at least for now, even if Alzheimer's is caught early, there's very little that can be done to alter the outcome or even slow the progression.
  • The fantastically elaborate design of the Stuxnet worm managed to override every safety system meant to ensure that the gas centrifuges at the Natanz nuclear facility couldn't malfunction. The whole chain, from the Windows operating system of the controlling workstation to the PLC (programmable logic controller) governing the speed of the centrifuges, was essentially taken over by the worm. The worm even made the control system "lie" to the monitoring computer by playing back data recorded from a normal run, à la Speed, while the PLC drove the centrifuges out of control. This was caused deliberately, of course, but it shows that human ingenuity, as well as human stupidity, can override failsafe systems.
    • Which is the reason why a reactor is never connected to the outside world in any way, other than perhaps a simple telephone (which is kept a separate system as well, just to be extra sure).
    • All thanks to USB drives and the unknown genius at Microsoft who thought "let's allow plug-and-play media to run programs as soon as they're inserted, without the user knowing" - because that could never be a problem.
      • While that was a dumb design flaw in Windows (since fixed), ask yourself why computers in that sensitive facility even had USB ports? The ports should have been plugged, or otherwise physically disconnected... except those ports were needed to accept the PLC program code which was then transferred to the SCADA units themselves. The SCADA units were physically airgapped but needed a way to receive programming. The worm tampered with the PLC code on the Windows computer before it was sent to the USB drives and from there to the SCADA units. Thus Stuxnet becomes one of the rarest types of malware: able to use Sneakernet to jump an air gap.
    • The Stuxnet worm is a bit of a special case, as it was not only designed to cause the system to fail but, well, TV Tropes is not necessarily saying it was designed with the help of the company who built the centrifuges...
  • The failsafes in the Fukushima Daiichi (Fukushima I, that's a Roman numeral one) nuclear power plant worked as intended after the 11 March 2011 earthquake in Japan, safely stopping all three of its working reactors. But then the tsunami came and washed out all the emergency diesel generators poorly placed right at the shore, and the plant's connection to the grid was severed by the quake. So the plant lost cooling at all of its six reactors, which led to the successive meltdowns of at least three of them.
    • What happened to the diesel generators is known as a "common mode failure" in engineering circles, and it's one of the hardest hazards to anticipate and prepare for.
      • The proposed Molten Salt Reactor design uses fuel that has to be in a melted state for the reactor to work. The core is kept from draining by constantly chilling a flattened section of pipe so that a plug of salt inside stays frozen. If power to the reactor building is lost, the cooling stops, the plug melts, and the fuel-salt drains into a passively cooled drain tank configured to prevent nuclear fission from happening - averting this trope in the simplest possible way, by powering the safety system entirely with gravity.
    • Even the loss of the diesel generators might have been dealt with, thanks to another designed-in safety measure: the plant could take electricity from truck-mounted mobile generators, which were in fact on scene within a few hours. The only problem was that no one had verified that the mobile generators and the systems they were supposed to power had compatible connectors for the power cables.
  • The Russian submarine Kursk sank due to one such design flaw: a faulty torpedo leaked hydrogen peroxide, which reacted with the torpedo casing, causing a first explosion that then set off the fuel and munitions in the torpedo bay. Arguably several design flaws let this happen (starting with explosions ideally not being the immediate consequence of a leaky pipe).
    • Even this would not have doomed the whole boat had there not been a ventilation duct through the bulkhead. The bulkhead itself was strong enough to contain the explosion of the first torpedo (a practice weapon with no warhead, whose welds had therefore never been tested), but the vent was not, and it let the blast through to incapacitate everyone in the command post in the second compartment. The torpedo tube door should also have contained the blast, but it was a common issue that the doors often required several tries to close properly due to bad contacts. Finally, the emergency buoy, which should have released automatically in the event of a catastrophic failure, had been disabled after concerns on a previous Mediterranean deployment that it would trigger accidentally and give away the submarine's position.
    • Almost exactly the same accident befell HMS Sidon in 1955 (except that the rest of her crew were able to evacuate), and it led the Royal Navy to drop that torpedo design several decades before the Kursk disaster. Russia was basically either lucky or unlucky enough (depending on your point of view) to avoid a similar incident for the rest of the 20th century.
  • The power outage during Super Bowl XLVII was ironically caused by the failsafe itself. A power relay, which was supposed to activate and relay power from another source in case of an outage, activated when it wasn't supposed to, causing a partial blackout in the stadium.
  • Stories like these are mass-circulated in maintenance circles as a reminder for safety and a reason for preemptive inspection. Always verify the numerous safety pins and indicators in an ejector seat before even touching it, because a system capable of sending a grown man eighty feet in the air in under two seconds doesn't leave much behind if the aircraft is inside a hangar with a forty-foot ceiling. Overpressurising an aircraft system has cracked an airframe in half. And altering the bell mouth of a fuel manifold so it no longer knocked against a back nut for a fuselage panel turned out badly when it emerged that the knocking was the only thing preventing the manifold from forming a vacuum seal and caving in.
  • sudo (in POSIX-style operating systems) and UAC (in Windows) are failsafes, of a sort, meant to make the user aware that whatever they're about to do may affect the whole system. Most programs that attempt an action requiring system-level changes will trigger one of them. Bypassing them, by running as root or by disabling UAC (or even just leaving the latter at its default "recommended" setting), allows a program to do whatever it wants.
    • Similarly, Android has in its security optionsnote  a checkbox that allows installing app packages from outside the Google Play Store. It's deactivated by default, and checking it triggers a warning about the possible consequences. So have fun if that APK file that supposedly contained a game also carries something much nastier.
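The gatekeeping pattern behind sudo and UAC can be sketched as a toy (the function names and messages here are invented for illustration, and this is emphatically not a real security boundary):

```python
import os

def guarded_delete(path: str, confirm: bool = False) -> str:
    """A toy sudo/UAC-style gate: refuse a destructive action unless the
    caller is privileged AND has explicitly confirmed it."""
    if not confirm:
        # The UAC-style prompt: no explicit consent, no action.
        return "refused: no confirmation given"
    # geteuid() is POSIX-only; treat other platforms as unprivileged.
    if getattr(os, "geteuid", lambda: 1)() != 0:
        return "refused: not running as root"
    return f"would delete {path}"  # the dangerous action itself is stubbed out

print(guarded_delete("/etc/passwd"))
print(guarded_delete("/etc/passwd", confirm=True))
```

Note that both checks must pass before anything happens; disabling UAC or running everything as root removes the second check, which is exactly the bypass the bullet above describes.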
  • An example of a double failsafe failure. After several tragic mid-air collisions (most notably over India in November 1996, claiming 349 lives), a system was introduced to prevent them: airliners were equipped with TCAS (Traffic alert and Collision Avoidance System), which detects other aircraft on a collision course and tells the crew whether to climb or descend to avoid a crash. Then, in January 2001, two Japan Airlines aircraft with a total of 677 people on board nearly collided over Suruga Bay, even though both were equipped with properly working TCAS. At the critical moment, one of the crews received avoidance instructions from the ATC controller that contradicted the TCAS commands; that crew followed ATC and disregarded TCAS, while the other crew received no ATC instruction and therefore followed TCAS. Only one pilot's quick judgment and last-second evasive action averted a collision, and even then the two aircraft missed each other by just 35 feet (11 metres). The Japanese investigation commission asked the ICAO (International Civil Aviation Organization) to establish a clear rule as to whose commands take priority in such a situation. For some reason, the ICAO ignored the plea, and seventeen months later two aircraft collided over southern Germany for exactly the same reason, killing 71. Only then were TCAS commands given absolute priority.
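The priority rule that eventually came out of these accidents amounts to a one-line decision procedure. A toy model (names and structure are illustrative, not real avionics):

```python
# Toy model of the post-2002 priority rule: when TCAS issues a
# resolution advisory (RA), it overrides any conflicting ATC instruction.

def choose_manoeuvre(tcas_ra, atc_instruction):
    """Return the manoeuvre the crew should fly."""
    if tcas_ra is not None:
        return tcas_ra            # an active RA has absolute priority
    return atc_instruction        # no RA: follow ATC as usual

# The Suruga Bay-style conflict: TCAS says climb, ATC says descend.
print(choose_manoeuvre("climb", "descend"))   # climb
# No RA active: the ATC instruction stands.
print(choose_manoeuvre(None, "descend"))      # descend
```

The double failure happened precisely because, before the rule existed, the two crews effectively ran this function with opposite branch orderings.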
  • Over the course of WWII, the Germans added increasingly elaborate anti-handling fuzes to the bombs they dropped on England, creating ever more complicated Wire Dilemma situations that would hopefully kill British bomb disposal technicians. Occasional instances of this trope allowed the British to safely disassemble the new bomb types, figure out how the fuzes worked, and from there work out how to disarm the bombs that weren't broken. Of course, in this case the feature that failed was a fail-deadly, whose failure caused the bomb to fail safe, which is to say the bomb didn't work.
  • Probably the worst known roller coaster incident in history occurred on the Battersea Funfair Big Dipper on 30 May 1972. The lift chain malfunctioned, followed by the anti-rollback mechanism, and the train rolled back to the station and collided with the other train. Five children died and another 13 were injured. The park struggled on for a couple more seasons before closing at the end of 1974.
  • The Smiler roller coaster at Alton Towers​ suffered a bad incident (although not as bad as the Battersea incident) on 2 June 2015; the operator, following standard procedure, sent a test train around the track, but assumed that it had completed the circuit instead of checking that it had. He then sent a passenger train, and the ride safety systems detected the impending collision and shut down the ride — and the operator assumed that this was a malfunction in the safety systems, and manually restarted the ride without doing any further checking, probably using a key he wasn't supposed to have.
  • Class D cargo holds were designed to be airtight in order to starve cargo fires of oxygen, preventing them from bringing down a plane. However, in the case of ValuJet Flight 592, the fire was caused by incorrectly declared, unsafely packaged oxygen generatorsnote , resulting in a self-sustaining inferno that brought the DC-9 down within minutes, killing all 110 occupants. Class D holds were discontinued after the accident because the FAA realised how futile it was to hope a cargo fire would simply smother itself mid-flight.
  • Automatic activation devices are lifesaving devices in skydiving, intended to automatically deploy the reserve parachute if the main has failed or has not been opened by a certain altitude. They have saved hundreds of lives, but they can sometimes fail disastrously. Because they work by registering air pressure changes and acceleration, they can fire unintentionally if the skydiver tries something daring at very low altitude, resulting in both the main and reserve canopies flying simultaneously. Three malfunctions are possible at that point: the biplane, where one canopy flies above the other; the side-by-side, where the canopies fly next to each other; and the downplane, where the two canopies point at the ground in opposite directions, creating no lift. The biplane and side-by-side are merely nasty nuisances, but a downplane is likely to be fatal unless the skydiver a) has enough altitude and b) manages to immediately cut away the main. And pray...
  • This is believed to have contributed to the Hinton train collision in Canada in 1986. A CN freight train ran a red signal and collided head-on with a passenger train, killing 23 people, including the crew in the engine. While the lead locomotive was equipped with a "dead man's pedal," the subsequent investigation found that it was common practice for CN crews to keep the pedal depressed with a heavy object so they didn't need to keep their feet on it. The engineer was found to have a number of health problems that put him at high risk for a heart attack or stroke, and it didn't help that the crew was also suffering from a severe lack of sleep due to shifting train schedules. One possible explanation is that the engineer was incapacitated and, with the dead man's pedal held down, the train kept running when it should have stopped. Ironically, the second engine had a newer reset safety control that didn't require engineers to keep their feet on a pedal, but it wasn't used because the cab wasn't as comfortable. After the accident, the railroad industry moved toward these safer controls.
  • Modern internal combustion engines with multi-gear transmissions have at least two failsafes to protect the engine, transmission and/or drivetrain from catastrophic overspeed failures. The first is a governor that automatically cuts off the fuel flow at or slightly above redline. The second is a set of interlocks in the transmission meant to prevent gear changes from placing the engine, transmission, and/or drivetrain into a state that would severely damage or destroy any of them. These are not new technologies: the former has been around since at least the 1940s (early gas turbines had them), and the latter was a standard feature of mid-80s Honda 5-speed manual transmissions (shifting from 5th to reverse wasn't possible without stopping at neutral along the way). If both fail, you probably just totaled your car.
  • The Hertfordshire Oil Storage Terminal fire in 2005 is believed to have been caused by a gauge malfunctioning, preventing the computer controlling the pumps from realising that a tank was full to overflowing and allowing petrol to pour out through the ventilators in the roof (another failsafe, ironically, designed to allow fuel vapour to disperse safely) until it found a source of ignition. This leak might have been noticed before it became disastrous if the failure hadn't occurred at 6AM on a Sunday, when there was only a skeleton crew on site, although the timing also meant that nobody was in several nearby industrial and office buildings that were trashed when the first explosion occurred. Either way, the end result was a fire that took three-quarters of the Hertfordshire county fire brigade two days to bring under control, property damage in the hundreds of millions and no fatalities.
  • Qantas Flight 72 is probably the ultimate example of a failsafe failure, where the failsafe didn't just contribute to or fail to prevent a disaster, but single-handedly caused one. A faulty air data unit fed the flight computer spurious data, making it erroneously believe the plane was flying at an impossibly high angle of attack and therefore in danger of stalling. This triggered two different failsafes (one related to the angle of attack, the other to stall prevention), which between them pitched the plane's nose down ten degrees. The plane was in reality flying level, so the pitch-down threw it into a dangerous dive. Fortunately, the pilots were able to regain control before the plane crashed, but the sudden change in g-force meant that anything and anyone not strapped down or similarly restrained was thrown around the cabin, resulting in dozens of injuries.
  • Flaws in anti-stall software were determined to be the cause of the two 737 MAX crashes in late 2018 and early 2019: shortly after takeoff, faulty sensor data would register the plane as close to stalling and force the nose down, which the crew would try to counteract, causing a back-and-forth struggle that ultimately drove the planes into the ground. To make matters worse, there were indications that Boeing knew of the faulty software but pushed the MAX out anyway, getting it into further hot water when the type was grounded for the rest of the year.
  • A RAID system with multiple hard drives is supposed to guard against hard drive failures by using redundant drives, but if all the drives are purchased from the same manufacturer that happens to have a bad batch, all of the drives can fail simultaneously. This happened to This Very Wiki in 2020.
    • Also played straight if RAID 1 (every drive stores a copy of the data) is naively used as a data backup solution. The thinking goes: RAID 1 makes another copy, and one of the pillars of a backup strategy is having multiple copies of the data, so RAID 1 fulfils that role. Except the point of backups is to be able to roll back to a certain point in time, and RAID 1 can't do that: if you delete a file you later need, or ransomware encrypts the data, the deletion or encryption happily carries over to every copy.
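The mirroring-is-not-backup pitfall can be sketched with plain dictionaries standing in for drives (everything here is a toy; real RAID mirrors at the block level, but the logic is the same):

```python
# RAID 1 replicates EVERY write, including deletions and ransomware
# encryption, to all drives immediately. A point-in-time backup,
# by contrast, is a frozen snapshot that bad writes can't reach.

drive_a = {"report.doc": "quarterly numbers"}
drive_b = dict(drive_a)      # RAID 1: drive B mirrors drive A
backup = dict(drive_a)       # Sunday-night backup: a frozen snapshot

def raid1_delete(name):
    """Deletions propagate to every mirror, just like any other write."""
    drive_a.pop(name, None)
    drive_b.pop(name, None)

raid1_delete("report.doc")           # oops
print("report.doc" in drive_b)       # False: the mirror faithfully deleted it
print(backup["report.doc"])          # the snapshot still has it
```

The mirror did its job perfectly; it just has no concept of "that write was a mistake," which is the one thing a backup exists to handle.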
  • Backups can be a lifesaver in case of drive failure, but if the backup media is bad, or automatic backups have been silently failing without the IT department knowing, then when data loss strikes it's game over: the backup can't be restored. Likewise if you fail to make offsite backups and your office burns down, or a pipe breaks over your data center.
  • Helios Airways flight 522, which depressurized and became a ghost plane because the Boeing 737’s pressure system was switched from automatic to manual so the ground crew could check out a potential problem and the mechanics failed to change the setting back to automatic afterwards. The pilots subsequently overlooked the incorrect setting three times during checklists. The other failure was the low cabin pressure alarm; it sounded just like the takeoff configuration alarm, which can only sound on the ground, so the pilots assumed there was a glitch. The ground controller asked about the pressure switch, but hypoxia had set in too quickly and the pilots were already too impaired to check it. The plane later ran out of fuel and crashed.
  • Exaggerated in the case of an incident that took a chunk of the Internet down for most of a day in April 2021. A very large server farm owned by hosting provider WebNX suffered a power failure, causing the onsite backup generators to automatically kick in... but, for reasons that have yet to be determined at time of writing, one of them suffered a truly spectacular mechanical failure and set itself on fire.
  • The fire in 1980 at the MGM Grand in Las Vegas. The fire took out the automatic fire alarm that would have warned guests. The backup for the alarm was the public address system, along with the phone system that could be used to call guests in their rooms. However, the smoke from the fire made it impossible to reach the room from which those systems were controlled.
  • Modern computer hardware and software typically have mechanisms for handling errors. For example, if a program attempts to divide by zero, the CPU raises a fault, which software is then supposed to handle. But the fault handler is itself software and, depending on how it was coded, can fault in turn (a double fault). That too has a handler, but if it also runs into a fault, the CPU just goes "screw it" and resets the entire system (the "triple fault" on x86).
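The fault / double fault / give-up chain has a direct software analogue in exception handling. A toy sketch in Python (purely illustrative; a real double fault is a CPU-level mechanism, not an interpreter one):

```python
# The "fault handler" itself errors, and a last-resort handler catches
# that. If nothing caught it, the interpreter would abort the program,
# much as the CPU resets the machine on a triple fault.

def shaky_handler():
    """A 'fault handler' that faults in turn."""
    return 1 / 0

def run():
    try:
        return 1 / 0                # the original fault
    except ZeroDivisionError:
        try:
            return shaky_handler()  # the handler faults: a "double fault"
        except ZeroDivisionError:
            return "caught the double fault"

print(run())
```

Delete the innermost `except` and the program dies with an uncaught exception, which is the software equivalent of the CPU's reset-everything escape hatch.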
  • Lots of r/TalesFromTechSupport stories involve this trope, usually the result of idiotic or awful management, incompetent/lazy contractors, or other assorted morons not properly investing in or using fail-safes for data, servers, etc. Inevitably things go south and the failsafes fail (or worse), and quite often this teaches management the value of backups and failsafes (that, or they continue to ignore it anyway).
  • The Grenfell Tower fire was one part this trope to one part Cutting Corners. Normally, when a fire breaks out in a similar tower block, the fire service advises residents to shelter in place rather than evacuate: the dividing walls and floors between individual apartments are concrete, and all main doors are made of flame-retardant material rated to hold for at least thirty minutes, more than enough time for fire crews to arrive and bring the blaze under control. This is generally considered safer than trying to get hundreds of people down the stairs to the ground floor all at once. What local fire codes had not anticipated was the full refurbishment some years earlier, in which the building was covered in aluminium cladding backed with thermal insulation; the owners and their contractors settled on non-fire-resistant insulation because it was slightly cheaper and the building codes didn't explicitly forbid it. The upshot was that when the heat exchanger at the back of someone's refrigerator overheated and caught fire, the flames spread through an open window to the cladding and began to engulf the entire building faster than the firefighters on scene could contain it.