"Out of order?! Fuck! Even in the future nothing works!"Because you can't spell "Failsafe" without "F-A-I-L." Thanks to Finagle's Law (or just ignorant writers), on TV a system's failsafe will never work when it's needed the most, nor will it actually be failsafe — usually it'll be quite the opposite, sometimes referred to as 'fail deadly'. The only reference to an emergency shutdown you'll be likely to hear is a panicked tech yelling "It won't shut down!" as the system runs wild. It's supposed to make the phenomenon of Explosive Instrumentation more plausible, by acknowledging it's not supposed to blow up in your face, but a failure elsewhere of a key safety lockout means it can, and will. It also justifies how something that is supposedly governed by industry-wide standards, regulatory law, and years of engineering refinements could go so horribly wrong in the first place. What's a failsafe? Well, the world is full of a lot of dangerous machinery and devices. Huge electrical turbines and nuclear reactors, Power lines carrying enough juice to light a whole city and pipelines carrying millions of tons of explosive petroleum or natural gas. Trains speeding down the tracks at 300 km/h, semi trucks that weigh in excess of 40 tons rolling down the freeways, aircraft that weigh more than 400 tons flying over our heads. And that's just the stuff that isn't designed to kill anyone. There's plenty of stockpiled bombs, missiles and such out there too. These could all cause some spectacular collateral damage if they suddenly went out of control. Thus, in the real world, things that have the potential for very destructive damage not only undergo strict maintenance procedures, but usually have circuit breakers, password protection, arming/firing keys, backups for redundancy, and prominent big bright red emergency handles that can shut the whole system down if pulled. 
More to the point, they usually have a totally separate set of safety features, designed to trigger automatically when the system's operating parameters get too far outside safe norms, which will (ideally) shut down the whole shebang without making the situation worse than it already is. Contrary to popular understanding, "fail safe" does not mean "safe from failing", i.e. "failure-proof"—it means that if (when) it fails, it will do so in a way that leaves it safe. When something is described as "fail safe", it means that it has been designed and built so that a critical mechanical failure or operator mistake will cause the system in question to default to its safest possible state, quickly and automatically, without any human intervention. Consider the following: if you're at an intersection where there's a traffic light, and it fails, if it "fails safe" then either it goes dark or all four directions show a red signal. If it showed green in all four directions, that would be a failure to fail safe. For more info, see the Analysis page. Real Life Failsafe Failures are often caused by an improbable and unanticipated conjunction of two or more failure conditions (one of which will often turn out to have never worked in the first place). Compare the way Hollywood treats personal vehicles when the owner is always Driving Like Crazy or leaving his car in a state of neglect, In Hollywood a decrepit car that endangers its occupants and everyone around it is frequently treated as comedy, and all too often this same insouciance extends to things capable of inflicting serious damage should one lose control of them. There's usually No Plans, No Prototype, No Backup or Lock-out/Tag-out procedures, and the Big Red Button is frequently unguarded. See also No OSHA Compliance, Override Command, Dead Foot Leadfoot, Inventional Wisdom and Plot-Driven Breakdown. Often invoked in a chain of Disaster Dominoes.
— Dark Helmet (on finding out his ship's self-destruct override is broken), Spaceballs
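For the technically inclined, the all-red traffic light can be sketched in a few lines of code. This is a minimal, invented illustration of the principle (none of these names come from any real traffic controller): whatever goes wrong inside the normal control logic, the system collapses to its safest known state rather than an unpredictable one.

```python
# Minimal sketch of "failing safe", using the traffic-light example:
# on ANY unhandled fault, default to the safest state (all red),
# never to an unknown or permissive one. All names here are invented.

SAFE_STATE = {"north": "red", "south": "red", "east": "red", "west": "red"}

def compute_signals(sensor_data):
    """Normal operation: derive signal states from sensor data."""
    if sensor_data.get("fault"):
        raise RuntimeError("sensor fault detected")
    ns_green = sensor_data["cycle"] % 2 == 0
    return {
        "north": "green" if ns_green else "red",
        "south": "green" if ns_green else "red",
        "east": "red" if ns_green else "green",
        "west": "red" if ns_green else "green",
    }

def controller_tick(sensor_data):
    """Fail-safe wrapper: any failure collapses to all-red."""
    try:
        return compute_signals(sensor_data)
    except Exception:
        return dict(SAFE_STATE)

print(controller_tick({"cycle": 2}))     # normal operation: N/S green
print(controller_tick({"fault": True}))  # fault: all four directions red
```

The key design choice is that the safe state is reached by doing less, not more: the fallback requires no sensors, no operator, and no working control logic, so it cannot itself be broken by the same fault that triggered it.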
Anime & Manga
- In Code Geass, we see one instance of mechanical failure in the standardized Ejection Seat, and that's presumably because it's being hit with microwaves that short out the electronics (and make the pilot pop like a potato in the microwave). In another episode, when Lelouch's mecha is shot down, the ejection seat has a terrible launch vector and thus ends up skipping along the ground at high speeds; it's a wonder he didn't suffer whiplash.
- In one episode this ends up being a good thing: when the Guren is defeated by Suzaku and blown off a Britannian aircraft, its ejection seat fails, which looks like it will mean Kallen falling to her death. But reinforcements arrive and are able to send her both repair parts and a flight pack, letting her save herself and get back into the fight, eventually getting past Suzaku to save Lelouch from the crashing ship. If the eject HAD worked, on the other hand, Kallen would have been launched to safety, but the Guren would have been destroyed and lost, and Lelouch would have died because Kallen wouldn't have been able to save him.
- Averted in Naruto. Minato and Kushina both implant their chakra in Naruto's seal, so that if he tries to break it they will show up to stop him and repair the seal. This actually works. Both times.
Tobi: It's what you'd call a fail-safe... Although he only got as far as the "fail" part.
- Played straight with both of Itachi's failsafes with regard to Sasuke, though. One was to implant a genjutsu in Sasuke that would cause him to immediately use Amaterasu upon seeing Tobi's Sharingan, to prevent Tobi from manipulating Sasuke into joining him. Too bad Tobi had kept his most powerful ability secret from Itachi, allowing him to survive the attack. Tobi then invokes the trope.
- Itachi's second failsafe was to...somehow...implant a crow carrying Shisui Uchiha's left eye into Naruto. It was set to trigger if Naruto ever saw Itachi's Mangekyo Sharingan again, brainwashing the possessor of those eyes into unbreakable loyalty to the Leaf Village, under the assumption that Sasuke would take Itachi's eyes and implant them to gain the Eternal Mangekyo Sharingan. Which did indeed happen, but before that, something Itachi could never have expected ruined this failsafe: Itachi was revived as a zombie and forced to fight Naruto, and thus the perfect brainwashing hit the wrong target. Itachi ultimately decides this was a really bad plan anyway.
- Neon Genesis Evangelion abuses this trope. Every second angel attack, someone has to push molly-guarded buttons, smash in the protective glass over a handle, or cut the power. Most of the time, the girl at the controls ends up shouting, "The EVA is rejecting the signal!"
- The one and only time that the failsafe system does actually work and manages to automatically eject the entry plug, it makes things considerably worse for the pilot. The mechanical systems of Eva apparently have it in for the pilots just as much as everything else.
- Granted, this is justified to an extent by the fact the Evas are only half-or-so machinery. Their human will seems to be able to screw things over pretty hard.
- Another example of it actually working would be in episode 13, when the viral angel attacks Nerv. When one of the Simulation Evas reaches for the Pribnow Box, an emergency shatter-and-pull mechanism blows the arm off, protecting the crew in the Box. The makers probably only allowed that because dying that way is not painful enough in Hideaki Anno's twisted, twisted imagination.
- Protecting the crew? It fired the severed arm straight into the glass viewing window, which started to crack and leak... forcing the team to abandon both the experiment AND the room before they drowned in plague-infected water.
Comic Books
- In Watchmen, Dr. Jon Osterman is trapped inside the Intrinsic Field test chamber by the door closing behind him when the automatic timer starts up the generators for that afternoon's experiment. As Dr. Glass puts it, "I'm sorry, Osterman. The program's locked in and we can't over-ride the time lock. It's... it's a safety feature." His last words indicate how horribly aware he is that this trope has come into play.
- Green Lantern
- The rings actually do have several failsafes which kick in: shutting down if the wearer breaks the Lantern Code, reserving a small supply of energy the wearer normally can't access to protect them from mortal danger, and so forth. But Lanterns have been able to override the latter failsafe to continue fighting after their normal reserve is depleted (and given that Lanterns are selected for fearlessness, it seems silly to allow that). Also, Hal Jordan was able to override the former failsafe, after his ring was shut down for insubordination, by drawing energy from a Guardian's construct.
- The Alpha Lanterns seemed to lack sufficient failsafes. While their minds were linked to the Book of Oa to make sure they faithfully executed their duties as Internal Affairs, there was no failsafe in place to stop them from being hijacked by Hank Henshaw, the Cyborg Superman, whose Kryptonian-based technology has traditionally been billions of years behind the Guardians. Possibly justified by Cyborg's machine empathy.
- Two Spider-Man examples, involving the same robot:
- In the aftermath of the Acts of Vengeance, the defeated Loki tried to get revenge by destroying New York: he stole control of three Sentinels built by Sebastian Shaw, combined them into the titanic Tri-Sentinel, and ordered it to destroy a nuclear power plant. As Spider-Man (who possessed the Captain Universe power) struggled to stop the thing, Shaw tried to activate a failsafe he had placed in the three Sentinels in the event they turned on him (as Sentinels often do). Simply put, the program would reveal to a Sentinel that, since its abilities were "inherited" and improved upon from the original Mark I Sentinels, it is technically a mutant. In theory, this would act as a Logic Bomb, causing a rogue Sentinel who has this revelation thrust upon it to destroy itself, as its directive is to destroy mutants. Unfortunately, Loki's sabotage had seriously screwed up the Tri-Sentinel's programming, and the failsafe didn't do anything more than confuse it for a couple of minutes. Still, that small delay was enough for Spidey to bring the Uni-Power to its full potential and blow it to dust in a climactic finish.
- The Tri-Sentinel appeared again later, this time in the hands of Life Foundation president Carlton Drake, who thought he could use it as security for his latest "luxury escape condo". Knowing that Sentinels could regenerate, he had his men gather its remains, and when it was almost whole, he installed what he thought was a better failsafe: he took a chunk of the incredibly rare Antarctic vibranium (a substance that melts metal), placed it in a special container that nullified the melting effect, and put the container near the robot's brain, thinking he could simply deactivate the container if the robot turned on him. As you might expect, when he activated the creature and ordered it to crush Spider-Man, ignoring the hero's pleas, it quickly deleted the programming he had installed and resumed the mission Loki had given it, and when the panicked Drake tried to activate the failsafe, he found that the robot had installed a failsafe of its own, preventing him from deactivating the container remotely. (Because fighting it the way he did before was out of the question, Spidey made his way inside the creature, fought his way past internal defenses up to its head, and deactivated the container manually. What happened next was ironic: the Tri-Sentinel clearly didn't have anything close to a failsafe for dealing with the Antarctic vibranium as it started to melt its brain, because, as Spidey put it, "Nothing like this has ever happened to a Sentinel before." It tried to expend power to regenerate itself, but the vibranium kept melting it, forcing the Sentinel to expend more and more power, until it was literally vaporized.)
Fan Works
- One engineer was so annoyed by the persistent number of Failsafe Failures in Star Trek that he wrote a fanfic about a leading engineer in the Star Trek universe being put on trial for negligence.
- Defied in the Star Trek Online fic Bait and Switch. The USS Bajor is a late-model Galaxy-class with an experimental warp core that operates with minimal fuel in the chamber, meaning there's no need for dilithium to moderate the reaction and a shutdown consists of merely turning off the fuel. Also, as specified in the TNG Technical Manual (which the various shows' writing teams apparently never read), its core ejection mechanism operates on the deadman switch principle and is actually an anti-ejection mechanism. Worth noting, the author is a troper and has read the above fanfic.
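The deadman-switch principle mentioned above is worth spelling out, since it's the inverse of how fiction usually imagines ejection systems. A hedged sketch (all names invented; this is not the fic's or the Technical Manual's actual system): the dangerous component is held in place only by a continuously asserted signal, so losing power or control causes ejection rather than preventing it.

```python
# Invented illustration of a deadman-switch ("anti-ejection") mechanism:
# the core stays aboard ONLY while a hold signal is actively asserted.
# No signal - because of damage, power loss, or a dead crew - means eject.

class CoreEjector:
    def __init__(self):
        self.core_ejected = False

    def tick(self, hold_signal_present: bool) -> bool:
        """One polling cycle. Returns True once the core has ejected."""
        # The failsafe is the *absence* of action: nobody has to press
        # anything for the core to leave the ship, so a dead console
        # or severed cable cannot block the ejection.
        if not hold_signal_present:
            self.core_ejected = True
        return self.core_ejected

ejector = CoreEjector()
ejector.tick(hold_signal_present=True)   # normal ops: core stays put
ejector.tick(hold_signal_present=False)  # signal lost: core ejects
```

Contrast this with the usual fictional design, where ejection requires an intact console, working power, and a conscious operator, i.e. everything most likely to be destroyed in exactly the emergency the mechanism exists for.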
- Invoked in the Harry Potter fanfic Make a Wish. Portkeys are made with a variety of safety features that prevent users from apparating in mid-air or inside a space too crowded or too small to contain them. The Death Eaters made the mistake of getting Portkeys from a wizard that doesn't like them, and is in fact giving them unsafe portkeys with the intention of getting them killed.
- In Marionettes, it turns out that this trope is why Trixie (who discovers she's an android) escaped the control of the Stallions in Black who were chasing her: the Alicorn Amulet messing with her head fried her failsafes and freed her. Twilight later invokes this trope to free Lightning Dust the same way.
Film
- In the Star Wars galaxy, as a rule, if you destroy a single control console for some piece of technology, that technology will immediately and completely fail. This can range from door/bridge controls (A New Hope) to the absolutely crucial deflector shields protecting a mining outpost on a volcanic planet (Revenge of the Sith). In Return of the Jedi, the 19 kilometer-long Super Star Destroyer Executor goes into an instant nosedive when its main bridge gets destroyed by a rebel fighter, with the thousands of crew members scattered throughout the ship apparently unable to do anything to prevent it.
- Dr. Strangelove has a pretty much identical plot to Fail Safe (see Literature below), but the attack is a result of human intervention rather than mechanical failure (although it is a mechanical failure that prevents one of the bombers from being recalled).
- The Alien franchise:
- In Alien, the spaceship's self-destruct fail-safe mechanism is virtually impossible to initiate by accident, but it is just as fiendishly difficult to abort. There is no quick reset button; it will not override without first manually disengaging the safety interlocks and inserting the rods back in. If a last-minute decision were ever made to abort, you're basically screwed.
- In Alien: Resurrection, the Auriga is programmed to automatically return to Earth in case something goes wrong. Unfortunately, the Auriga is the site of an alien breeding and testing facility, which is the absolute last thing you want near an inhabited planet.
- This is parodied in Spaceballs when, after the Big Red Button is pushed activating the self-destruct, the computer says in the last few seconds that they can stop it by pressing a button to cancel. The button, of course, has a big "out of order" sign hanging on it, which prompts Dark Helmet to shout, "FUCK! Even in the future, nothing works!"
- In Live Free or Die Hard the bad guys blow up an entire natural gas facility by routing all the gas to it. There would actually be dozens of failsafes to prevent the necessary overpressure from breaking anything at all, much less exploding. Of course the bad guys used the power of Hollywood Hacking to pull it off, since computers are magical and none of the failsafes are purely mechanical, either.
- A textbook example from The Machinist: a worker is repairing a broken machine when someone accidentally leans on the On button (which is only possible because the workshop has No OSHA Compliance whatsoever). Hammering on the Off button does absolutely nothing; the repair worker is dragged into the machine and loses his arm. It's not clear what was wrong with the machine to start with, but it might have been a good idea for someone to disconnect the power before sticking his arm in there. It was made clear that the machine was supposed to be locked out, but the manager had previously reprimanded employees for taking too much time to get equipment fixed. That kind of pressure is definitely illegal, but happens more often than you'd like to think (especially in small shops with narrow profit margins). The reveal that the main character is insane and frequently hallucinating might explain it, however.
- Speed: the end involves a Runaway Train with the emergency brake disabled and no Dead Man Switch. And it didn't trip any overspeed controls either.
- Also in Speed, there's the elevator at the beginning: Payne destroys the emergency brakes with bombs, and then it turns out the crane Jack and Harry hooked to the car to secure it couldn't hold the weight either.
- In Capricorn One, a government agency specifically defeats every failsafe on Robert Caulfield's car. Not only do the brakes fail, the throttle gets stuck wide open, the gearshift is locked so he can't go into neutral, and even turning the key off won't cut the ignition.
- In the notorious Irwin Allen disaster flop The Swarm (1978), the killer bees attack a nuclear power station and cause it to blow up almost instantly when one of the technicians falls across a random instrument panel. Also, the actual core is completely exposed to the air, without any evident shielding.
- James Bond
- In Thunderball, an assassin tries to kill Bond by turning up the setting on a spine-stretching exercise machine he's strapped into. Bond blacks out and is only saved by a nurse who happens to enter the room just in time, leaving viewers to wonder why the hell the machine was even designed to go that fast.
- In Moonraker a mook tries to kill Bond by disabling the chicken switch on a centrifuge and cranking the spin rate to unsafe levels. Not so much Failsafe Failure as intentional tampering, but why would a piece of equipment designed to test human endurance have the wires to the safety switch connected to a plug easily removable by the controller, and why would it go up to speeds considered dangerously unsafe for humans in the first place?
- Michael Bay's The Island has two notable examples. A truck carrying a massive load of train wheels loses its entire load when a single strap is released; then, at the climax, throwing a single breaker switch causes the entire mechanism to explosively fail. But this being a Michael Bay film, having things blow up is to be expected.
- Justified in The Taking of Pelham One Two Three where the safety devices on a New York Subway train are actually a plot point. The police believe the Dead Man's Handle will prevent the villains jumping off the train while it's moving, but they've actually rigged up a system to hold down the lever. Later as the train appears to be careening out of control, it's eventually stopped by the safety devices built into the track.
- In Outbreak, a lab technician is infected with The Plague when he carelessly opens and reaches into a centrifuge while it's still spinning, breaking a vial of infected blood and cutting his hand. In Real Life, lids on most (but not all) centrifuges lock until the spinning has completely stopped; for these models, it's impossible to open one while it's still in motion.
- In The Andromeda Strain, the lab has a nuclear self-destruct device, with three substations (to disarm the bomb) per floor, but it's discovered they need five per floor, and are in the process of adding them, but they haven't been finished (this is a government installation, of course). When the self-destruct countdown is activated, team leader Stone, along with the only team member who has the shut-off key, are trapped in a section with an unfinished nuclear destruct shut-off substation. Stone cries out, "When the bomb goes off, there'll be a thousand mutations! [The virus] Andromeda will spread everywhere, they'll never be rid of it!" He touches the other team member and points at the exposed, unfinished shut-off substation. "The defense system is perfect, Mark, it'll even bury our mistakes."
- For something less high-tech, Sylvester Stallone's 1993 Cliffhanger starts with Gabe (Stallone's character) climbing up a mountain to rescue friends Hal and Sarah. To get to the rescue helicopter they have to pull themselves along a line stretched across a chasm, suspended by their climbing harness and a carabiner (a big metal clip). When Sarah is in the middle of crossing, the carabiner starts to buckle; Gabe goes out on the line to catch her, but he is too late and she falls to her death. The problem with that scene is that a carabiner is designed to withstand the weight of a falling climber. A standard one would have a rated strength of 23 kilonewtons, while the static load of a Hollywood starlet would be around 0.5 kilonewtons. At that point, viewers who knew their climbing may feel their Willing Suspension of Disbelief shatter with the carabiner. The studio was actually sued by the carabiner manufacturer as a result of this scene and the inaccuracy of it breaking in such a scenario; when it snaps there is a close-up of the carabiner with the manufacturer's logo prominently displayed.
- Wing Commander begins with the Kilrathi attacking an outpost in order to capture its navigational data. Realizing why the enemy is attacking, the commander of the outpost orders the navcom, which is located in a sealed room, destroyed. However, the Self-Destruct Mechanism refuses to work, due to sabotage.
- This sort of thing seems to happen all the damn time in the Final Destination franchise, to the point that it's a wonder that horrific freak accidents don't happen constantly. Some of these failures, though apparently outlandish, are still within the realm of possibility, however unlikely.
- Averted, subverted and justified in quick succession in the climactic scene of the movie version of The Hunt for Red October.
- The Aversion: To 'sell' the appearance of having destroyed the submarine Red October to its recently-evacuated crew, the US Navy attack it with an air-dropped torpedo. The torpedo is successfully aborted before impact in the scene which became Trope Namer for I Was Never Here.
- The Sub Version: The commandeering of the Red October is interrupted by the arrival of a Russian Alfa-class attack sub, which launches a torpedo. Captain Ramius orders the sub steered into the torpedo's path at full throttle, closing the distance before the torpedo's warhead can arm itself - its safety features work a little too well.
- The Justified Version: In response to this failed attack, the commander of the Alfa orders all safety features disabled on his remaining torpedoes. As a result, after playing tag with the next torpedo he fires, the commander of Red October is able to decoy it into locking onto the vessel that fired it, destroying the Alfa.
- The B-movie Evolver involves a failed military robot (which ended up killing dozens of soldiers during a training exercise) being re-purposed as a household laser-tag toy. Needless to say, the robot reverts to its original programming and starts killing people. When its creator attempts to use the verbal shutdown code that worked during the training exercise, the robot simply rejects the override and kills the guy.
- Inverted in WarGames. JOSHUA doesn't have a failsafe; it has a "fail-deadly" Gone Horribly Right instead. The General asks why they can't Cut the Juice to the hacked nuke-controlling master computer. The computer tech explains that if the slave computers in the missile silos don't receive a signal from the master, they will assume that the master computer and NORAD are destroyed, and spin up and launch everything.
- In Jurassic World, Owen and two other park employees try to escape the Indominus rex's exhibit through a door with a control panel on the inside. One worker gets out, and the control room then has to try to shut the door behind him because the I. rex is too close behind Owen. The door closes so slowly that Owen gets out, but also so slowly that the dinosaur is able to stop it from closing all the way.
- In Star Trek Beyond, the air processing system on starbase Yorktown has all kinds of elaborate safeguards to prevent anyone from tampering with it via the computer network. But a person can simply take an elevator to the roof of the building it's on and release a bio-weapon with ease even as people in the command center struggle to overcome those very security protocols to try to stop them.
Literature
- Obviously Fail Safe (the book, movie and TV drama) qualifies. The American strategic nuclear forces have a system in place to prevent bombers from attacking the Soviet Union without clear authorization - bomber crews are conditioned to turn back at the fail-safe line no matter what is happening around them, if they haven't received the go signal themselves. In this case it's an "active go" system, designed to send out an attack order - and it does so incorrectly, due to a subtle and unnoticed technical fault.
- Norman Moss, in the book "Men Who Play God", makes it clear that this is not how Real Life works - the "go" signal is a voice order that must be given by a human being and cannot be transmitted accidentally. Nevertheless it remains a cautionary tale against the dangers of too much automation in military systems, a thing the American President and Nikita Khrushchev are left bemoaning to each other near the end of the book, as the last bomber closes unstoppably on Moscow...
- Mentioned in Mostly Harmless:
- Subverted: it is explained that a set of special bulletproof windows are not designed to be shot at from inside. They can also be jimmied open with just a credit card. This is because of 'The Great Ventilation and Telephone Riots of SRDT 3454'. The main cause of the riots was a building environment control system. Part of the installation process involved sealing the windows shut, to make it easier for the system to do its work. One particularly hot day, many of these systems broke down, resulting in the overheated office workers taking to the streets. As a result of the riots, buildings were required to have windows that opened.
- In the same book, it's mentioned that the difference between something that might go wrong and something that "cannot possibly go wrong" is that when something that cannot possibly go wrong goes wrong, it's usually impossible to fix.
- In the Discworld novel The Light Fantastic, magic is weakening on the Discworld. This causes people to riot against wizards. Good thing Unseen University has some big, heavy doors. Too bad the only locks are magic spells, with no good, solid steel lock.
- Averted and lampshaded a bit in the book 2001: A Space Odyssey, where the designers of the airlock doors' failsafes are said to have remarked, "We can protect you from stupidity; we can't protect you from malice."
- In both the book and the movie of The Taking of Pelham 1-2-3, the criminals hijack a subway train, and demand that all of the signals all the way to South Ferry be set to green. The train then rolls through every signal and station, at speeds exceeding 80 miles per hour. Note that the train does stop when it gets to South Ferry as it is going too fast for the turn and trips the overspeed control. However it's interesting that it never tripped any overspeed control along the way.
- In the climax of The Shining (the Stephen King book), lead character Jack Torrance desperately tries to cool down the main boiler of the Overlook Hotel, while Danny, Wendy and Mr. Hallorann escape on a snowmobile. At first, it looks as though the boiler (which has to be constantly maintained) will return to acceptable levels, but the pressure is already too great, and the boiler blows up, taking Jack, the hotel and the topiary animals with it. Justified in this case as the boiler is explicitly described as both very old and very dangerous; the hotel manager has been bribing the safety inspector for years to keep it from being forcibly replaced.
- As above, The Andromeda Strain. The bacterium mutates and destroys gaskets... the very gaskets protecting the lowest, most secure level from being contaminated. A nuclear bomb is set to destroy the base, but since the bacterium mutates when exposed to energy, the blast would cause a worldwide outbreak.
- In The Stand, the engineered superflu virus nicknamed "Captain Trips" is accidentally released from a top secret installation in the High Mojave, and, unfortunately for the rest of the world, a security guard is able to escape because the doors to his station (which he thought erroneously to be "clean") did not magnetically lock at the moment of the installation's containment breach. The guard takes his family and flees, making it all the way to East Texas before dying. General Billy Starkey, the man charged with the containment operation, later comments on this fact.
- In Dave Barry's Big Trouble, it is mentioned that the corrupt Mega Corp. built a new prison in downtown Miami using off-the-shelf garage door openers to power the cell doors open and shut. Someone accidentally hit their garage door opener button while driving by soon after the jail was filled, and every door in the place opened. Hilarity Ensued.
- Justified in the Thursday Next book Lost in a Good Book. The nanomachines that unstoppably convert all organic matter into Dream Topping are contained in an extremely strong electro-magnetic field. The field is maintained by three generators, all of which would have to fail simultaneously in order to release it, an astronomically unlikely possibility. They do fail, though, because the villain Aornis Hades has the ability to manipulate coincidences.
- On Star Trek: The Next Generation, a standard Holodeck Malfunction episode goes like this: "Computer, freeze holodeck program!" (pregnant pause) "Computer, exit!" (slaps combadge) "Picard to Bridge!" (silence) ...Oh. Shit.
- Averted in the DS9 novel Valhalla. A sentient, suicidal starship tries to blow itself up by running its fission pile too hot. (Un)fortunately, a mechanical failsafe triggers, wrecking the drive in the process.
- In the Star Trek: Deep Space Nine novel, Time's Enemy it is shown that the self-destruct command for Jem'Hadar ships is simply "Destruct" in their own language. There is no override, there is no countdown. This is simple and, with the mindset of the Jem'Hadar, the perfect method.
- It's also fairly secure. Jem'Hadar learn foreign languages ridiculously quickly and never use their own language in the presence of aliens, so there's no risk of an intruder arming the self-destruct.
- In the Starfleet Corps of Engineers series, a Federation space probe in one story (actually entitled Failsafe) suffers from this, requiring the crew undergo a mission to retrieve it from a pre-warp planet. Sonya Gomez even seems to lampshade the improbability.
- A Warhammer 40,000 short story by Graham McNeill had an ultra-maximum-security prison in which all the doors automatically unlocked in the event of a power outage.
- Starships in Honor Harrington, especially warships, are built with numerous failsafes that usually work as intended. Occasionally, though, Weber falls into this trope, such as when one of the circuit breakers protecting a fusion plant from power surges is itself knocked out, destroying the ship. Such failures are generally the result of combat damage or sabotage, as the engineers know that if the failsafes on certain systems (notably the reactors and the inertial compensators) break down, they'll all be dead before they can try to fix it, so they make certain that critical systems are in good repair at all times to prevent spectacular accidents.
- Averted in Children of the Mind, where an active MD Device is quickly and easily disarmed by a technician. The tech notes that the planet-atomizing superweapon was deliberately designed to be easy to turn off, with the missile having instructions printed on it to explain how to do so. "Now, turning it on, that's hard." Good thing it had been designed like that, as in this book the "Dr. Device" is fired at the planet Lusitania but the protagonists use their new teleportation tech to send it back inside the ship that fired it.
- In Sergey Lukyanenko's Emperors of Illusions (part of the Line of Delirium trilogy), Arthur van Curtis holds the command crew of an Imperial cruiser at gunpoint while the ship is in hyperspace. He orders the crew to prepare to drop the ship out of hyperspace without first slowing down. In this case, the ship enters normal space at relativistic speeds, and by the time it slows down, decades or even centuries will have passed for the rest of the universe. This has happened before, and yet nobody has made deceleration an automatic part of dropping out of hyperspace instead of simply a step that a crewmember might one day forget.
- The French sci-fi novel Malevil briefly considers this. World War III occurs, and nobody is certain why it happened; the characters lived through it, yet the lack of information and details turns it into the Great Off Screen War. One of the possible, never-to-be-confirmed theories as to why the world ended was a Failsafe Failure.
- In the novel The Dorset Disaster, a nuclear reactor explodes due to this, because someone tampered with the settings that controlled when the reactor should SCRAM. In a bit of similarity to the real-life Chernobyl incident, it was SCRAMing too often and annoying the people running the plant, so they changed the settings, and that led to a big kaboom.
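The failure mode in that story, operators widening a trip setpoint until the automatic shutdown can only fire after the damage is done, can be sketched in a few lines. This is a minimal illustration only; every name and number below is invented and resembles no real reactor protection code.

```python
# Hypothetical trip-setpoint failsafe, and how "tuning out" nuisance
# trips defeats it. All identifiers and values are made up.

DAMAGE_THRESHOLD = 900.0  # temperature beyond which the core is damaged

def should_scram(core_temp: float, trip_setpoint: float) -> bool:
    """Fail-safe rule: shut down as soon as the reading crosses the setpoint."""
    return core_temp >= trip_setpoint

# As designed, the trip fires with margin to spare.
assert should_scram(core_temp=820.0, trip_setpoint=800.0)

# After the "annoying" setpoint is raised past the damage threshold,
# the same excursion no longer triggers a shutdown at all.
assert not should_scram(core_temp=820.0, trip_setpoint=950.0)
assert 950.0 > DAMAGE_THRESHOLD  # the failsafe can now only fire too late
```

The fail-safe property lives entirely in where the setpoint sits relative to the damage threshold; make the setpoint operator-adjustable and the "safety" is only as good as the most annoyed shift worker.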
- When the Arrandas visit the Nightmare Machine at Hologram Fun World, in Galaxy of Fear, they're told that the simulations, which are terrifying, will end if they say "End simulation!" Of course, it doesn't work, because the Nightmare Machine is actually a psychic monster and not interested in letting them go.
- Subverted in Timothy Zahn's The Conquerors Trilogy. With a human ship about to fall into hostile hands, a failsafe is activated so that the computers are purged of data before it can be captured. It works, but it turns out the captain had a computer of his own in his desk not connected to the system on which the data was duplicated.
- In Zeroes, a police station has holding cells with electronic locks. They're supposed to remain closed and locked even in the event of a power failure; but when Crash uses her Walking Techbane powers to destroy the police computer systems, the failsafe fails and a number of prisoners escape.
- In every science fiction TV show and film from the 1960s on where combat occurs, it's clear that no one is familiar with circuit breakers and similar devices. Everything electrical, from generators to control consoles, routinely catches fire or explodes, despite there being nothing in the typical device capable of such reactions. Even old-style fuses would be better than what normally happens in science fiction when there is an electrical problem.
- This happens all the time in Star Trek, where fail-safes are almost never shown or mentioned unless they fail. One fan group came up with the name "SINEW" for this phenomenon: "Somehow it never, ever works."
- The warp core is chronically incapable of being shut down in case of emergency. This gets to the point where, in the later series, systems are built into ships to eject the core when it's about to explode. These also always fail. Fridge Logic says they would have to cut off the antimatter fuel input before safely ejecting the core, which should mean the reactor shuts down and doesn't need ejecting anymore... but apparently the only time the writers remember the ship even has fuel is when they do a running-out-of-fuel episode. The tendency of the antimatter core to suddenly end up in a runaway overload suggests it is constantly run at a dangerously high power level. Compare this to modern nuclear reactors, where the reaction is kept just high enough to keep things running while being easy to shut down (as well as self-stopping) in the event of a crisis.
- The classic Holodeck Malfunction requires four conditions that you'd think would be unlikely to all happen at once: the exit door, the off-switch, and the safety protocols must fail, and everything else must continue to work perfectly. Apparently, this happens almost every time the holodeck is used on-screen. There is no easy way to turn it off even from the outside, as for some inexplicable reason the holodeck is the only system on a Starfleet ship to have an independent power supply, and the crew can't shut it down any better than the warp core. And why is the holodeck designed to accurately simulate dangerous things like bullets and mustard gas anyway?
- For some unknown reason, a lieutenant, albeit the Chief Engineer, has sufficiently high administrator privileges on the Enterprise-D's computer that he can not only accidentally lock himself out of the system, but also everybody else all the way up to the captain! Even worse, this also allowed an accidentally-created sentient hologram being generated by the computer itself to gain those same absolute administrator privileges! (TNG: "Elementary, Dear Data")
- There is no way that the bridge life-support systems would fail on their own, as Geordi points out; there are seven independent interlocks to prevent it. Data is able to seize absolute control of the ship's computer mainly by virtue of the fact that, having sophisticated speech capabilities, he can precisely mimic Picard's vocal patterns and fool the voice biometric authentication the computer uses. The fact that "Picard" is giving verbal orders to the ship's computer from the bridge when he is actually in main engineering (location supposedly being something the computer tracks) does not hinder Data in any way. It could also have been avoided by requiring more biometric checks than just Picard's voice (such as scanning his eyes and face too). It's also unsettling to see how easily one android hijacks the entire Enterprise, and none of the security holes this reveals are ever mentioned or shown to be fixed. (TNG: "Brothers")
- Particularly ridiculous is the season one finale of Star Trek: Voyager, where the "manual" override on a door lock is shut down by a power failure, negating the very purpose of a manual override in the first place. As SF Debris put it, "That's like having an emergency light that plugs into a wall socket, or a parachute with a rope attached back to the airplane."
- In the episode "Unexpected" of Star Trek: Enterprise, the handrail on the lift in engineering is capable of severing limbs, as there is no cutoff if it meets resistance. The only person who views this as a problem is supposedly irrationally anxious due to pregnancy. The person he's talking to questions why anyone would put their hand on the handrail.
- The brigs on Starfleet ships default to open in case of power failure. Even if they need the forcefield to prevent "jail break" via transporter, there's no reason they couldn't also have a thick steel door (not getting into the many natural minerals that have stopped transporters). There isn't even a backup power supply. The same applies to the medical quarantine in sickbay, and to the force fields that are frequently placed around hazardous life-forms and other suspicious specimens. This is not to mention the fact that force fields are also used to seal hull breaches in combat, meaning that in the event of power loss, which is likely to occur if one's shields are down, the ship is open to space.
- In one episode of Battlestar Galactica, two characters stuck in a leaking airlock are told that the "manual override" for the door had failed. This provided an excuse to space the pair in an over-the-top CG-fest. It's good to know that no major space craft will ever carry a crowbar. The fact that the Galactica was old when the series began and has gone several years (and multiple battles) without adequate maintenance at this point provides a thin veneer of justification, but it's very thin.
- Justified in an episode of The Outer Limits (1995): the technicians who created a planet-busting bomb are abducted and tortured by aliens for information on the bomb, which is sitting in the room with them. The technicians figure out how to bypass all of the safety measures and arm the bomb... only to find out they are on Earth undergoing a psychological stress test. Of course the testers used the real bomb.
- MythBusters: If they're not going to ridiculous lengths to defeat the failsafes on some common household item so they can replicate a myth's results (translation: make something explode), chances are you'll find them putting out a fire or running to catch a driverless vehicle because one of the failsafes on something they made has failed.
- Specifically, the tendency for their remote controlled cars to run wild and take out the chain link fence of their test area has become a Running Gag. The failsafe is supposed to apply brakes to the car if it loses radio contact. Some of them were genuine failures, some of them were somebody forgetting to set it. One time Jamie forgot to set it before Adam jumped in for a ride...and scratch another fence.
- Sometimes they do consider a myth "confirmed" if the failsafes preventing the results they wanted could reasonably be disabled by a normal person. For example, the "water heater rocket" myth required them to close off a safety valve that would otherwise prevent the thing from launching through the building and into the sky, since to a normal homeowner the valve's constant dripping may look like nothing more than wasted water running up the bill.
- In one instance they had to disable the safeties on an elevator. Adam eventually reported, "Anticlimactically enough, I believe I've disabled the entire mechanism by removing this simple pin." Granted, it was a very old elevator in a condemned building already slated for demolition. Wonder why...
- One subversion: in an episode dealing with a car bumper being fired like a rocket after the shocks failed, the guys tried everything and were defeated by at least four failsafes. They then talked to a person who had had both legs shattered by that exact failure.
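The runaway-R/C-car failsafe above is a standard dead-man watchdog: if no radio packet arrives within a timeout, the controller applies the brakes, and it only helps if someone remembers to arm it. A minimal sketch of the pattern (hypothetical Python; all names are invented for illustration, not taken from any real R/C system):

```python
import time

class BrakeWatchdog:
    """Dead-man failsafe: engage the brakes if no radio packet has
    arrived within `timeout` seconds. The `armed` flag models the
    setup step that occasionally got forgotten."""

    def __init__(self, timeout: float, armed: bool = False):
        self.timeout = timeout
        self.armed = armed
        self.last_packet = time.monotonic()

    def on_radio_packet(self) -> None:
        # Heartbeat from the transmitter resets the countdown.
        self.last_packet = time.monotonic()

    def brakes_engaged(self, now: float) -> bool:
        if not self.armed:
            return False  # nobody set the failsafe: scratch another fence
        return (now - self.last_packet) > self.timeout

# Armed: half a second of radio silence and the brakes come on.
wd = BrakeWatchdog(timeout=0.5, armed=True)
assert wd.brakes_engaged(now=wd.last_packet + 1.0)

# Forgotten: the same radio silence does nothing at all.
wd = BrakeWatchdog(timeout=0.5, armed=False)
assert not wd.brakes_engaged(now=wd.last_packet + 1.0)
```

Note the design choice: the safe action (braking) is the default on loss of signal, which is the textbook fail-safe shape; the weak link is that arming is a manual step.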
- Played absolutely straight in Season 4, where Marwan The Wonder Terrorist manages to steal a MacGuffin that can cause every nuclear reactor in America to go into simultaneous meltdown. How one electronic device could untraceably hit over 100 sites at once AND bypass the dozens of failsafes, manual breakers, and shunts present at every site they don't even try to explain (let alone WHY an American company would make such a thing).
- Inverted in Day 8 where a suicide bomber blows up because of the failsafe.
- It's actually incredibly common in SG-1, as well as its sister series Stargate Atlantis. This was lampshaded in the season eight episode "Avatar", when the weekly reverse-engineered alien tech malfunctions and General O'Neill responds with "I thought failsafes were supposed to be safe from failure."
- The Stargate system itself has plenty of failsafes, however, as presumably it was designed to be used by other races, without such devil-may-care attitudes about personal safety. Unfortunately for the SGC, they're operating their Stargate without the standard control device, and feel free to completely disregard any warning signals from the Stargate itself.
- Even with the "safe" Stargates, ones operating with the original control computer, a gate activation completely vaporizes anything standing in front of the ring with only a few seconds' warning.
- There is a safety to prevent the gate from closing when something is halfway through; on one occasion, Jack holds a gate open (to prevent the villains from dialing another location and escaping) by not withdrawing his arm from the event horizon after he arrives. The only time there is a threat of something getting cut in half is when the gate hits the 38-minute time limit, and since it's physically impossible to keep a wormhole open longer than that (except under special circumstances), no failsafe could possibly prevent that.
- Also commonly inverted in Stargate Atlantis, where Atlantis often activates failsafes during bad situations. It inevitably does exactly the wrong thing for the given situation, forcing McKay to waste precious time overriding the failsafe, or killing someone when he fails to.
- The Pegasus Galaxy also has Spacegates, and there's no way to tell whether you dialed an address for a gate on the ground or a gate in orbit besides stepping through (or checking the Ancient database, but not every civilization has that option). The Pegasus network was designed for Lantean spaceships, but if a failsafe was ever needed in their designs, it's here. It was established at a few other points in canon that there's a whole lot of "error codes" that the gate sends back to the DHD, and it's set up to not allow things like water rushing through from an underwater environment (while still allowing solids). Since the system isn't completely understood by anyone, it's possible there's an "Are you really sure you want to step through into space?" warning light.
- Failsafes built into the Stargate network, as well as other Atlantean systems, have been mucked around and hacked so much by the Atlantis expedition (in order for Earth PCs and computers to interoperate) that they might as well not exist. Consider "Avenger 2.0" where Felger's code is none too secure, or where McKay's software updates "broke the Gate". Of course, it wasn't really broken, but nevertheless, there were glitches because of this (hence Sheppard's 40,000 year trip into the future and then some).
- Also, the Ancients aren't big fans of failsafes. Ancient devices need to have a function as complex as time travel before someone will even consider putting in a failsafe. The failsafe that prevents Atlantis from being crushed underwater after it has run out of energy wasn't even included until someone traveled to the past from a future where the city and its occupants were flooded, for instance... quite natural, because the Lanteans probably didn't foresee that they would submerge the city at the bottom of an ocean and then leave it unattended for 10,000 years.
- Atlantis itself is the mother of all failsafe failures, a spaceship without an airtight interior. If the shield fails in space, everyone dies, as the crew unpleasantly discovers when they run into power problems while moving it to another planet.
- The fact that you can remove the safeties on a ZPM is a biggie; you'd have thought the safeties on something that powerful would be impossible to bypass.
- In Stargate Universe:
- The Destiny's emergency atmosphere-retention forcefields are only strong enough to reduce the flow of air through gaping holes in the hull, not stop it. It's possible that they used to be stronger but are now past their best-by date. They also may not have been designed to compensate for such extensive deterioration.
- One jammed-open airlock door is apparently enough to drain the air from all of the remaining habitable areas of the ship, with no other doors capable of being sealed to further compartmentalize the area. Granted, the ship has already had a lot of sections sealed off for this very reason, so perhaps this is simply a case of failsafes being driven past their limits by repeated failures. But one would think that having an airlock and the Stargate in the same emergency-seal compartment would be a bad idea.
- In a case where a working failsafe caused problems, the shuttle docked at said airlock wouldn't close its own airlock door without someone inside it to operate the controls. This probably prevents you from locking yourself out, but in this case meant a Heroic Sacrifice was needed to stop the shuttle from leaking all of Destiny's air.
- Destiny's atmosphere recyclers packed it in over the millennia and the ship automatically stopped off at a planet where needed chemicals could be found to repair it. But the ship's autopilot was still set to take the ship back to hyperspace after a fixed period of time, whether or not the chemicals had been recovered by then.
- This trope is part of the show's stock in trade. Most of the drama is derived from either yet another of Destiny's millennia-old unmaintained systems failing catastrophically, or the failsafes working but none of the human crew knowing what the failsafe procedure actually is, and everyone panicking. When the proper operating procedures include diving into a sun to refuel, the panic is understandable.
- Averted in one case with an electrified corridor. There's a handy manual breaker just to make sure you can kill the power. However, the fact that a corridor can, without warning, turn into an electrified deathtrap, invokes another trope.
- It gets better: the ship's capable of getting into your head. In "Trial and Error", when Destiny sensed Young was going through a mental breakdown, it initiated a program that caused him to have continual vivid dreams and hallucinations of the ship being destroyed, evaluating whether he had the ability to continue commanding the vessel. Let that sink in: the ship saw someone suffering a mental breakdown and decided the best solution was to do something that nearly drove him insane.
- Red Dwarf:
- In the episode "Demons And Angels", Lister says the ship is full of fail-safes; "The actual chances of it exploding are one in—" Red Dwarf explodes. "... one."
- During the Time Skip after the series prologue, Rimmer also managed to flood the entire crew compartment of the ship with lethal radiation, because he conducted a repair without proper assistance. Ironically, the cargo decks were safely sealed during the event. Why Rimmer was even allowed to touch the ship's nuclear reactor, especially without assistance, is never clarified, but Kryten later successfully defends Rimmer in court with the argument that the person responsible for the accident was whoever gave such an obvious incompetent a job that important.
- The novelisations have the accident unfold slightly differently, and notably it isn't Rimmer's fault: three important warning lights fail to activate, and when several more come on later, the engineering watch-stander blames them on the coffee he just spilled all over his keyboard.
- Averted in Angel, where Lindsay tried to activate Wolfram and Hart's failsafe, but Angel's people stopped it from getting loose.
- Averted in Terminator: The Sarah Connor Chronicles wherein a nuclear power plant's failsafes DO kick in, but a T-888 deliberately sabotages them one by one.
- In the Supernatural episode "Devil May Care" (S09, Ep02), Kevin is locked in the bunker unable to contact anyone after the sensors detected the falling angels.
- This also happens in an episode of The Professionals. The components of Doyle's car are set to fail one by one as he's going down a hill. Justified though as the killer isn't trying to fake a car accident; he's just playing with Doyle before he kills him.
- In the season two finale of Mission: Impossible, the team has to retrieve a failsafe device from a B-52 that failed to self-destruct when the plane was shot down over an East Bloc nation. They have to secure the device rather than simply destroy it (which would also keep the Communists from reverse-engineering it and defeating other failsafe devices) because the original manufacturer needs to take it apart, figure out why this one failed to self-destruct, and ensure the other devices from that production run don't suffer from the same defect.
- Combined with The Guards Must Be Crazy in the Supergirl episode "The Darkest Place", where the Fortress of Solitude's security robot, Kelex, is fooled into thinking Hank Henshaw is Kara when Hank dumps a vial of blood onto the Fortress' console, even though Kelex is looking right at Hank and should see that he isn't her.
- Referenced in the Mystery Science Theater 3000 episode Gamera:
Soldier: "The electrical shocks don't seem to bother Gamera at all!"
Servo: "Hm, and I was counting very heavily on them..."
- In most episodes of Thunderbirds, disasters were caused, or at least not averted, by faulty safety equipment or poor engineering. Examples included bridges that collapsed as soon as their maximum load limit was exceeded, aircraft whose nuclear reactor shielding failed if the flight was delayed, failure to survey sites properly before beginning major engineering projects, and numerous vehicles without a Dead Man Switch or equivalent.
- Subverted in Paranoia, as Friend Computer is pleased to report that, before being deployed on Troubleshooter missions in Alpha Complex, any theoretically-dangerous devices, accessories, networks, REDACTED and/or systems are uniformly equipped with unbreakable failsafes which have been rigorously tested in the field by dedicated and expert Troubleshooter teams.
- Ace Combat 5: During an airshow turned firefight, Chopper gets shot up by enemy forces, damaging most of his plane's internal systems. He still flies for a minute or so to find a safe place to land, but by that time, his eject system is damaged too heavily for him to eject. He dies in the following crash.
- Half-Life: When things go awry at the Anomalous Materials research department of the Black Mesa science facility, the classic exchange is heard:
"Shut it down!""It's not-[BANG] it's not-[BANG] it's not shutting down! It-[screams]"
- Justified example, as it is heavily implied that the incident was orchestrated by the G-man.
- And a whole lot of safety protocols were overridden or just plain ignored because Breen said so (possibly related to the above). One of the scientists comments on how "We've assured the Administrator [Breen] that nothing will go wrong (cue meaningful look at other scientist)", implying that the science team knows this and is grudgingly working around it.
- Plus most of the Black Mesa systems were too old and were never meant to be used the way they were using them.
- Justified in Half-Life 2: Episode 1. The Citadel's dark energy reactor had a failsafe that was deactivated by the Combine as part of a Xanatos Gambit. Reactivating the failsafe doesn't prevent the reactor Going Critical as its condition had already deteriorated, but it slows the process enough to allow the rebels to evacuate.
- In the Portal series, we find that GLaDOS had an emergency red phone in the control room, so scientists could call if she had problems. Too bad the cable was cut, preventing anyone from calling to report she was about to kill everyone with deadly neurotoxin.
- Subverted in the opening of Xenogears. When an alien threat is taking over the ship, the command crew attempts to cut power using the emergency blocker, a three-foot section of the power cables that jettisons itself out, creating a massive break in power and electronic communication. Unfortunately, the alien threat manages to arc across the gap and continue its takeover. Their second failsafe (self-detonation) works.
- The game 7 Days a Skeptic features, on an advanced space ship, an escape pod door that opens whether or not there is an escape pod that can be boarded behind it. Almost needless to say, if there's no escape pod, one is greeted by hard vacuum.
- Referred to by name in The Stanley Parable. In the explosion ending, there's a button that, when pressed, shows a message on a screen reading "Failsafe Failure". No button in the room can stop the explosion and save Stanley.
- The Resident Evil series has at least one in every single game, mainly for the purpose of locking you into whatever room has the latest mutated/undead boss. As well, even the Failsafe Failure fails as the door lock mechanism invariably releases the second you kill the Big Bad du jour. The best example of a needlessly complex failsafe comes from the train braking system in Resident Evil 0. To activate the brakes, you must:
1: Find the brake instruction manual in the front car.
2: Pick up the card key for the brake system.
3: TRAVEL THROUGH THE ENTIRE TRAIN to the brake-lock system located outside of the caboose.
4: Insert card key.
5: SOLVE AN ADDITION PUZZLE.
6: TRAVEL BACK THROUGH THE ENTIRE TRAIN to the front car.
7: SOLVE THE SAME ADDITION PUZZLE A SECOND TIME.
8: (really, the only step that should be here) Pull brake lever.
- Slightly hand waved in the RE storylines in that the designers of pretty much everything in Umbrella Corp. and Raccoon City were utterly insane.
- An Interactive Fiction game titled Fail Safe puts the player in the role of a computerized emergency help system. Someone in a crisis is calling for help on the radio, and you have to consult your database and give him instructions on what to do. There is a twist of the Tomato Surprise variety.
- Metal Gear Solid:
- There's a failsafe failure that is actually part of FOXHOUND's plans. You receive a key which is part of the override, but the override wants three keys. Thus, you need to heat the key up to make it fit the second slot, then cool it down for the third. It's rather lucky that the base holding Metal Gear REX contains a foundry, and is in Alaska; otherwise, how else would you use the key? This is made even more baffling when you consider that the nuclear missile will launch with only two codes, yet three keys are needed to override it; even worse, said override key will actually arm the weapon if it's in a disarmed state.
- More intelligently integrated into Peace Walker, which is a "fail-deadly" nuclear device designed to guarantee a retaliatory nuclear strike in the event of an attack, creating true MAD (the idea being that humans might not be able to launch a retaliatory strike, knowing they'd wipe out an entire country). Unfortunately, the flaw is that it doesn't personally check to see if nukes are being launched, and the story revolves around someone feeding the machine fake data in order to make it strike.
- Fallout 2:
- A thorough letter can be found about what would happen if someone disables the Oil Rig's reactor cooling systems: a megaton-sized meltdown (not that that's possible in real life, but then Fallout's nuclear physics have always run on the power of imagination). Guess what you need to do to proceed: hack the control system or simply blow it up. Gecko's reactor also counts, since you are ordered to shut it down; you can interpret that as fixing the coolant leak, or as running into the active zone and turning a big red valve to shut down the cooling system altogether, causing a nasty meltdown and forcing a whole city to relocate. Now then, why would anyone make the cooling system controlled by a single valve, and more importantly, why would anyone place that valve somewhere you can't reach without being roasted alive?!
- There's a robot you can control in the Gecko reactor, so it's not all that mind-boggling. And the ghouls are immune to radiation, so again they wouldn't mind waltzing in and turning the valve.
- Fallout 3:
- Project Purity: when you finally mow down the Enclave defending it, Li warns you that it was damaged in the fighting; the tanks are experiencing critical overpressure and the whole thing will explode unless the purifier is turned on to relieve the pressure. Thing is, another failsafe blew out and the control room has a fuckton of radiation inside so it's going to be a one-way trip. Earlier, you can read a message that Vault 87's GECK room has radiation purge systems but they can't deal with all that radiation coming from the outside and are constantly failing.
- Or the DLC "Broken Steel": two confirmations and no password protection whatsoever are NOT enough to prevent the Mobile Base Crawler from calling down an orbital strike on itself, is that too hard to realize? If you have a really determined foe on a Roaring Rampage of Revenge toting a BFG, a platoon of infantry with air support isn't foolproof either. Clue: the Enclave learned this the hard way.
- Fallout: New Vegas has one implemented due to carelessness before the bombs dropped in the Dead Money DLC. Before the bombs fell, the Sierra Madre was set up to broadcast a distress signal on its external radio antennas if something catastrophic happened. Unfortunately, since all the radios were set to broadcast music for the Grand Opening until after it was finished (three guesses as to what happened before the Grand Opening), the 'emergency' broadcast is still a broadcast inviting people to the Grand Opening Gala Event at the casino proper. Since the broadcast was designed to repeat until help arrived, this has led hundreds of unwary explorers to their deaths over the years...
- Brave Fencer Musashi: Whoever designed Steamwood is NOT an engineer. Any damage at all to it results in catastrophic pressure build-up which can only be released with eight separate valves on separate floors which have the most convoluted method of operation seen even in a video game. I'm surprised Grillin' Village is more than a steaming crater.
- In the Infocom game Suspended, the player wins a lottery to function as a fail-safe for the global weather control systems. Of course, when everything goes wrong, it turns out the player's robots are broken, the repair center is a mess, your provided documentation contains errors, and nobody actually told you how to find out what the problem is or how to fix it. Of course if you are taking too long to fix the problem and the casualties are piling up, actual repairmen will show up to deal with the issue. They start by turning you off — as it's assumed these failures could only occur if you were causing them.
- Might and Magic VIII plays with it. From your perspective, the failsafe itself is the failure (as it will cause the destruction of the world, all for no gain). From the perspective of the ones who implemented the failsafe, it works perfectly; you just happen to be collateral damage (the reason the failsafe exists is to stop the Kreegan from subverting Escaton).
- In a twist of irony, the fail-safe prevention in the Panzer Dragoon universe is humanity itself. Various organisms, notably Coolias, have the inbred genetic potential to mutate into dragons, mentally overwritten by the Heresy Program to target and destroy the Towers and end the Ancients' terraforming plans. Unfortunately, dragons are seen as bad omens and abominations in general, and they are summarily executed by humans when discovered.
- In Five Nights at Freddy's, the doors to the office open when the power runs out. While you would want that to happen in every other instance, here it spells doom since there are four homicidal animatronics in the building with you.
- Sarge's quote in the quotes section from Red vs. Blue highlights his tendency to build machines with "failsafes" that end up backfiring on him somehow. For example, the bomb that he built into Lopez could be armed remotely, but Sarge designed it so that he himself couldn't disarm it, just in case he was captured and brainwashed into helping the Blues. Brilliant.
- Parodied in the Strong Bad Email "cliffhangers". Strong Bad, in his sci-fi persona of Space Captainface, orders his trusty engineer "Strap" Coopmore (the Cheat) to activate "the forward humbuckers" and prevent their ship from colliding with a comet. The Cheat points to a sign reading "The forward humbuckers have never worked."
- Riff's Mini Fission Comrade from Sluggy Freelance.
- Subverted here in Skin Horse. Genetically-engineered dog Sweetheart has taken refuge from an Alaskan snowstorm inside the team's V-22 aircraft (with an A.I. known as Nick). After cranking up her courage with a bottle of schnapps, Sweetheart is ready to go out and save the mission, only to be stopped by Nick. Due to the fail-safe big orange safety lever, that doesn't stop Sweetheart for long.
- Freefall #2268 is an aversion: the failsafe works, but the user still complains.
Stupid computer! Security should not fail safe! Security should fail dangerous!
- In Larp Trek, Picard's mystery hinges on the idea that, while a holographic knife would dematerialize before piercing flesh, a real knife could be used by a hologram to stab someone. Geordi thinks, and then desperately hopes, the holodeck doesn't work that way.
- In Katamari, cousin Ichigo's attempt to turn off a potential Doomsday Device is halted by a rather important design flaw.
RoboKing: Wait patiently and watch as I earn my place in history through peerless cunning and technological prowess. Please ignore the fact that I couldn't build a fully functional on/off switch.
- Doctor Grordbort's Contrapulatronic Dingus Directory (a mock catalogue of Steam Punk rayguns and other non-existent devices of the scientific-romance era) is full of warnings about involuntary sterilization or the loss of "only some limbs" through mishandling of the devices, which is frightfully easy to do due to their needlessly complex controls and lousy human-engineering.
- In Ben 10: Secret of the Omnitrix, the titular device gets messed with in such a way as to cause it to start a countdown to an explosion that will destroy the universe. The subversion is that that is the failsafe. The creator figured that destroying the universe itself was better than having the thing fall into the wrong hands. Which raises a number of questions that have never really been resolved.
- Megas XLR:
- During the first Season Finale, Coop frantically searches his dashboard for a button that will save the world, only to discover that the button actually labeled "Save the World" was marked "Out of Order".
- A more serious example occurs when Megas breaks its protonic stabilizer, a part that seems to be the ONLY thing keeping its reactor core from immediately Going Critical with enough force to destroy a planet. A SCRAM or shutdown of the reactor is apparently impossible, as it's never mentioned as an option.
- In one episode of Archer, a computer virus infects the mainframe and threatens to upload all the spies' names to the virus' creator. They get the idea to just unplug the mainframe until everything can be sorted out, but it turns out the mainframe has a battery backup. Behind a nearly indestructible locked door. Whose lock is controlled by the mainframe.
- My Little Pony: Friendship Is Magic, season 2, episode 1 almost invokes this by name. Magical chaos is running wild. Twilight Sparkle, having seen this happen before (mainly from her own spells), has developed a failsafe spell for just this sort of occasion, and sees no reason to not use it at the first sign of major trouble. But this time, the source is stronger than she is used to, so...
Twilight: My failsafe spell...failed.
- In the Bugs Bunny cartoon Hare Lift, Bugs and Yosemite Sam are aboard a pilot-less aircraft. After an extended argument, Bugs rips out the plane's steering yoke. In response, Sam pushes a button marked "autopilot". A thin, beeping robot then emerges and upon seeing the condition of the plane's controls, immediately grabs one of two parachutes and jumps.
- In King of the Hill, Peggy goes skydiving, but both her chute and emergency chute fail to deploy. She ends up breaking several bones.
- A lot of the crises in Transformers Rescue Bots could have been avoided if the scientists in Griffin Rock ever bothered to build failsafes into their tech.
- In 2017, WannaCry (a ransomware worm) affected more than 20% of hospitals in the UK, later spreading to over 74 countries. The malware was designed to query a hardcoded, unregistered domain name before doing anything else, and to shut itself down if that domain actually resolved. A twentysomething British security researcher, known online as MalwareTech, found this flaw, registered the domain, and suddenly every infected PC reaching its address... shut the virus down. As he described it: "It thought it was in a sandbox [testing environment] and killed itself." Or, to put it another way, it was the cybersecurity equivalent of launching a missile that will only detonate if it hits a preassigned city... and having the missile technician add "Mojave Desert" into the missile's launch code database.
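The kill switch amounts to a single DNS lookup at startup. A minimal sketch of the logic in Python (the domain passed in is a placeholder; the worm's actual hardcoded domain is not reproduced here):

```python
import socket

def kill_switch_tripped(domain: str) -> bool:
    """Return True if `domain` resolves -- the condition under which
    WannaCry halted itself. The argument stands in for the worm's
    hardcoded, initially unregistered domain name."""
    try:
        socket.gethostbyname(domain)
        return True   # Domain exists: assume a sandbox and stop.
    except socket.gaierror:
        return False  # Domain unregistered: keep running.
```

The moment a researcher registers the hardcoded domain, every newly infected machine's check flips from False to True and the worm exits, which is exactly what happened worldwide.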
- As recounted in the Seconds From Disaster documentary, the Gare de Lyon rail accident in 1988, where an inbound SNCF commuter train with disabled brakes crashed into a stationary outbound train, was the result of a horrible chain of errors and system failures that completely overwhelmed the existing failsafe mechanisms. Almost everything that possibly could have gone wrong, did.
- The Moorgate Tube crash of 1975 highlighted a major flaw in the London Underground's safety precautions, which were designed to withstand an inexperienced or careless driver engaging the brakes late but not to handle a train running towards the end of the line at full throttle. And for reasons never definitively established (the official verdict was rictus from a seizure but many believed suicide was more likely), the driver was still gripping the Dead Man Switch right up until the train hit the end of the tunnel.
- The Therac-25 radiation therapy machine is now used as an example in engineering textbooks of how not to design a safety-critical system. Through a combination of corporate negligence and incompetent design, it killed or maimed several patients with overdoses of radiation. The machine contained two radiation sources: one with low power for direct use, the other 100 times more powerful to be used only with diffusing hardware. A software module was intended to prevent human error from activating the high-power beam without all its accompanying hardware engaged... but pressing a key at just the right instant would crash that module and the operator would have no idea what had happened. Oooops. The Therac-25 disaster is used to demonstrate several basic design principles:
- Do not reuse existing software after hardware changes.
- Provide human operators with clear and significant error messages.
- Do not rely exclusively on software to verify hardware status.
- Have a clear reporting system for errors and accidents at the corporate and governmental levels.
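The third principle can be illustrated with a hypothetical interlock sketch (the names and sensor API below are invented for illustration; the real Therac-25 trusted only its internal software flags, so when the checking module crashed there was no check at all):

```python
# Hypothetical sketch: before firing in high-power mode, verify the
# actual hardware state via an independent sensor readback, rather
# than trusting only the software's belief about its own state.

class InterlockError(Exception):
    pass

def fire_beam(mode: str, software_flag_ok: bool,
              diffuser_sensor_engaged: bool) -> str:
    if mode == "high":
        # The software flag alone is not trusted...
        if not software_flag_ok:
            raise InterlockError("software interlock not set")
        # ...the diffuser's physical position is read back independently.
        if not diffuser_sensor_engaged:
            raise InterlockError("diffuser hardware not in place")
    return f"firing in {mode} mode"
```

With a hardware readback in the chain, a crashed or bypassed software module fails toward refusing to fire, instead of silently permitting an overdose.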
- Darwin Award winners/losers often go to extreme lengths to override major failsafes in order to achieve minor objectives. Like this man, who had to try really hard before he could get run over. Or the winner who tried to unjam a woodchipper without turning it off first and Fargo-ified himself.
- The sinking of the Titanic on 15 April 1912 was so unexpected because of its novel failsafe design, with multiple watertight compartments that should have been able to keep it afloat even if one compartment was breached. The bulkheads that were supposed to seal off the compartments, while extending above the waterline, were not sealed at the top, meaning that they could still overflow and fill other compartments, something of a design flaw. And of course, the final failsafe on any ship—the lifeboats—failed to save most of the passengers, because there weren't enough of them. The Titanic had room to carry enough lifeboats, but it sailed with only one-third of its capacity. Contrary to urban legend, this wasn't due to the hubris of the designer and crew in believing it unsinkable. Carrying only a fraction of lifeboat capacity was standard practice at the time, based on assumptions about how slowly passenger ships sank—it was expected that help would arrive before the ship had to be completely evacuated and the lifeboats would simply ferry the passengers to the rescue vessels. The only ships that sailed with enough lifeboats for everybody were warships, which were expected to go down in conditions where they would sink fast. There were regulations on the bare-minimum number of lifeboats to be carried, but they were based around the weight of the ship, not passenger capacity, and the number aboard Titanic as-built was already over that limit.
- The Chernobyl Nuclear Power Plant had all the normal failsafes for its reactor design but operators had deliberately disabled many of them to test a new shutdown procedure.
- Worse, the operators were trying to increase output because the test wasn't working as anticipated. Even worse than that, the shutdown had been scheduled much earlier in the day, but minor problems on the grid delayed it. Thus, the crew performing the test were not the ones who had been briefed on it. They mistakenly reduced power, then overcorrected into a completely untested and unstable reactor mode... then decided to proceed with the test instead of just shutting down.
- The operator errors were exacerbated by several design features that were clearly less than ideal. For example, the control rods were poorly designed: in an emergency reactor shutdown or "scram", the control rods are dropped en masse to block neutron emissions and shut down the chain reaction. Because Chernobyl's control rods had graphite tips (the same material used to moderate the reaction in the first place), the scram caused a sudden power flare before damping the reaction. The Soviet Union was aware of the potential for power spikes, but previous spikes had always been brought under control and the problem never became a priority to fix. By the time the (unfortunately manually operated) scram button was finally pressed at Chernobyl, partial meltdown had already begun. When all of the graphite rod tips entered the chamber at once, the resulting power spike damaged the reactor vessel. The control rods broke off, leaving the reaction-boosting graphite tips lodged in the chamber and the actual control rods jammed and broken, at which point the reactor exploded. However, many other RBMK-design reactors were successfully operated for many years after the catastrophe, suggesting that the design issues were not as big a factor in the disaster as human error.
- Britain's experimental Windscale nuclear reactor was simultaneously an example and an aversion of a Failsafe Failure. The reactor was constructed in order to give Britain parity with the United States in the nuclear arms race, but a combination of modifications to the reactor's operating procedures and incomplete understanding of graphite's response to nuclear bombardment resulted in situations where the reactor would periodically give off spikes of high heat, for which the original temperature monitoring equipment was woefully inadequate. It also lulled the operators into getting accustomed to seeing occasional high temperatures in the normal course of operation. Thus, when the reactor caught fire, it was at first thought nothing was wrong. It continued for three days at temperatures exceeding a thousand degrees centigrade whilst the temperature sensors - located away from the hotspots - reported normal operating conditions. The original design allowed for the uranium fuel cores to be pushed through their channels into a cooling bath, but by the time the fire had been discovered the cores were too hot to move. Not only had they become jammed by heat expansion, they were so hot that metal poles used to try and move them simply melted on contact. After several failed attempts to cool the reactor it was eventually brought under control by flooding the cores with water.
- However, the disaster could have had far more serious consequences. Windscale was air-cooled. Core temperature was kept under control with a series of fans, and the waste heat was exhausted into the air. On the suggestion of Nobel prize-winning nuclear pioneer Sir John Cockcroft, the cooling towers were fitted with expensive, complex air filters, which were originally pooh-poohed on account of the work involved - the towers had already been constructed by the time Cockcroft found out about them, and the filters were large, heavy structures that had to be built on top of the towers. As it turned out, the filters prevented the direct release of red-hot nuclear particulates into the environment, although the release of radiation was nonetheless substantial.
- On the Boeing 747's first flight, the backup batteries that would have powered the hydraulics in case of engine failure failed upon takeoff. Doesn't sound like much until you find out that the newly introduced high bypass engines were very finicky and the engineers had little idea whether they would stall upon takeoff due to the change in the angle of attack in the air (these engines stalled very easily - at the time a tailwind could easily lead to a stall). Engine stall = no power = no hydraulics = no control over flight surfaces = guaranteed crash = 700,000lb bomb loaded with jet fuel. Thus, they strapped on some batteries to power said hydraulics, but the batteries failed. Fortunately the first flight went according to plan. Source: Wide-Body: The Triumph of the 747 by Clive Irving.
- The tale of The Gimli Glider. On July 23, 1983, a combination of underfueling Air Canada's brand new Boeing 767 and a faulty fuel level sensor led to its pilots not knowing they were low on fuel until they ran out, at 41,000 feet in midair. To top it off, many of the instruments in the cockpit were electronic, powered by engine-driven generators, meaning that most of the pilots' instrumentation went dark along with the engines. Fortunately for all concerned, there were a few battery-powered backup systems and a nearby decommissioned landing strip (the former Royal Canadian Air Force Station Gimli), and Captain Pearson happened to be an experienced glider pilot. The Gimli Glider managed to land safely (though it blew out a few tires on the landing gear and skidded to a stop on its nose) with no fatalities and only minor injuries. The aircraft was repaired and flew for 25 more years until it was retired in 2008.
- This is actually an aversion, as the 767's Ram Air Turbine, basically a small windmill emergency generator, deployed automatically and provided minimal power to the instruments until they finally lost too much airspeed before landing. That said, the evacuation slides did not deploy properly, which would actually be a minor example of this trope (the collapsed nosegear meant that the slide was at the wrong angle, leading to some of the aforementioned minor injuries).
- On Aug. 18, 2003, Hitoshi Nikaidoh, a surgical resident at Christus St. Joseph Hospital in Houston, Texas, was decapitated by an elevator with faulty door failsafes. The car was supposed to be "out of order", but some jerk removed the sign. Did anybody think to cut the elevator's power?
- Pressurized or liquid gas cylinders are nasty things if not treated nicely. There are very good reasons why cylinders have pressure release valves and rupture disks, and why those should never be removed or disabled. As MythBusters demonstrated, a gas cylinder can punch a nice, clean hole through a cinder block wall (they built one for the test). In the process, the wall was shoved back noticeably and the wall behind said cinder block wall was nearly punched through itself.
- In the design of the space shuttle Challenger, the joints between booster rocket segments were sealed by two thin rubber O-rings, the second ring intended as a failsafe if the first ring failed to seal. The SRB design itself wasn't flawed; it worked fine — so long as you maintained mission parameters, didn't try to launch on a day below the recommended operating temperature, and didn't reuse parts that were obviously deteriorating. The Morton-Thiokol engineers who designed the SRB knew this and objected to NASA and their own upper management overriding their recommendations. Thanks to near-freezing temperatures at the January 28, 1986 launch, both rings failed to seal and were vaporized. Then there was nothing to stop the rocket's flame from burning away one of the booster's support brackets; the unsecured booster smashed nose-first into the external fuel tank, which then ruptured under the aerodynamic forces. Also, pressure suits and the cockpit ejector seats had been discarded after the first few missions, since crews of seven couldn't all be ejected during launch. In the two-deck orbiter design only the pilot and commander could have ejected, and possibly the two crew seated behind them; because they were a deck down, the other passengers (in Challenger's case, including teacher-in-space Christa McAuliffe) would have died anyway. By contrast, the earlier Apollo and Mercury launchers were both equipped with escape towers (separate rocket systems that pull the crew capsule off of and away from a faulty booster rocket) while the side-by-side Gemini design allowed for ejection. And two different cosmonaut crews have been saved from certain death by escape towers launching their Soyuz capsules away from an exploding rocket.
- The O-rings weren't even designed to act as failsafes for the rocket boosters. It was discovered that the stress of ignition tended to bend the joints between two segments of the booster; however, the O-rings, heated by the igniting rocket fuel inside, would expand enough to cover the gaps. As this seemed to work just fine, the design wasn't changed. And because the launch site was in Florida, no one thought to take into account the effect of extreme cold on the rubber O-rings. In fact, the Morton-Thiokol engineers specifically told NASA on the morning of the launch that they had no idea how the cold would affect the O-rings (as the launch had already been delayed several times, NASA pressured them to give the okay). So, in a cascading series of improbable events, the joint bent under the stress of ignition while the O-rings, which had frozen overnight, failed to heat and expand fast enough to seal the gap and were instead vaporized.
- The Soviet Buran shuttle was launched some two years after the Challenger catastrophe, but included the ejection seats for the whole crew right from the start of the design process. In fact Soviet designers have long (and quite vocally) criticized the Space Shuttle cockpit layout, calling it a throwback to the WWII era bomber cockpits, and made a point of putting all crew seats on the same deck. Had the Shuttle used the same layout, at least several crewmembers could've possibly been saved.
- The crew of Soyuz 11 died from a leaking pressure valve during reentry. During descent, the explosive bolts which attached the service module to the descent module were fired simultaneously instead of sequentially. This damaged the pressure valve which was supposed to equalize pressure inside the module once they entered Earth's atmosphere, causing it to open while the module was still in space. There was a manual override for the valve, but it was located underneath the seats, making it almost impossible to find in an emergency. Later, a cosmonaut on the ground attempted to close the valve himself and found it took over a minute to do so. The biometric sensors on one of the cosmonauts showed 40 seconds elapsed between loss of pressure and death; in reality, oxygen deprivation would have incapacitated the cosmonauts far sooner, in approximately 15-20 seconds.
- Apollo 1 and Liberty Bell 7. The latter case came first.
- After Virgil Grissom's Mercury capsule splashed down, the explosive bolts on the hatch (which were there for emergency egress purposes) went off, allowing water to rush into the tiny capsule and sink it. Grissom was very much in danger of drowning: the recovery helicopter crews concentrated on saving the capsule, and only after it was clearly a lost cause did they realize the astronaut was also struggling to stay afloat and rescue him. After losing a spacecraft, NASA decided explosive bolts were a bad idea and did not incorporate them into subsequent designs, instead opting for doors which required much more deliberate effort to open. On the Apollo 1 spacecraft, the door opened inward. They claimed this was safer (and in a fundamental way, it is: pressure doors are supposed to open in the direction of positive pressure so that this pressure works with the door to keep it closed under normal conditions. That's why airliner doors always open inward while submarine hatches always open outward). While the crew of Apollo 1 was doing a test on the ground, a fire started in the capsule. The atmosphere inside was pure oxygen at greater than sea level pressure (in case you flunked chemistry, pure oxygen cranks any flame in it Up to Eleven), and NASA had also managed to put all kinds of flammable materials in the cockpit. As the fire rapidly grew, the pressure inside the cockpit grew with it, making it impossible for any human being to open the inward-opening door, which, due to the lack of explosive bolts, could not be blown open. Smoke and fire turned the cockpit into a fiery tomb from which there was no escape, and all three astronauts died. One of those astronauts was Virgil Grissom.
- The fact that the cabin was pressurized above sea-level pressure (about 16.7 psi of pure oxygen, to simulate the net outward pressure the capsule would experience in space) was also a major contributing factor. It meant there was both far more oxygen available to accelerate the fire and that an inward-opening door was simply physically impossible to open until the pressure was reduced. Just as there were no explosive bolts, there was also no means of depressurizing in time. There did exist a means for depressurizing rapidly, called the cabin repress valve; there was just no time to use it. According to the Apollo 1 fire timeline, the crew reported the fire at 23:31:04.7. The Command Module ruptured due to pressure at 23:31:19.4, less than 15 seconds later. During the investigation, it was determined that had the repress valve been opened, it would have delayed the rupture by about one second. Even if they could have instantly flushed the atmosphere, the interior surfaces carried foam padding, there to protect bulkheads and side panels from scuffs and dings during ground testing and scheduled for removal before an actual launch. After soaking in pressurized pure oxygen for over three hours, that foam would have burned like napalm even in hard vacuum.
- A similar story happened in the Soviet program too, but it wasn't really a case of failsafe failure, only crew error. A cosmonaut on a week-long test in an oxygen chamber decided to brew himself some tea and turned on a hot plate. As he was scheduled to have some medical tests taken that day, he needed to clean and disinfect the electrodes' attachment points on his skin, which he did with an alcohol-soaked cotton swab, which he then unthinkingly tossed in the general direction of the garbage bin. Unfortunately, the swab landed right on the hot plate and, in the chamber's pure oxygen atmosphere, combusted immediately, starting a major fire. Due to the design of the chamber door, it could only be opened some 20 minutes later, by which time the cosmonaut in question, Valentin Bondarenko, had already sustained third-degree burns, from which he died a couple of days later.
- The Apollo 13 Failsafe Failure was even more spectacular (the fact that NASA managed to bring the command module home with all three men alive is often considered NASA's Crowning Moment of Awesome). It was a whole series of Failsafe Failures.
- The faulty oxygen tank on Apollo 13 was previously installed on Apollo 10, but was removed and sent back to the factory because of a design change. It was jarred during removal because someone forgot to remove a screw that was holding it in place, causing it to be pulled up a few inches and dropped by a machine arm. This knocked the drainage tube in the tank (which was also part of the electronic level gauge) out of alignment, which prevented the gauge from properly indicating when the tank was empty. The tank contained an electrical heating coil that could be turned on to heat the oxygen inside for use in flight, and a pair of fans that were used to stir the contents of the tank to get an accurate level reading in ballistic (zero-g) flight. During a test run, because the gauge wasn't working properly, technicians believed the tank wasn't draining, so they turned the heater on. There were two thermostats on the heater which should have opened and broken the electrical circuit if it got too hot, but the tank circuits were designed to run on 28 volts, provided by the on-board fuel cells during flight. Since this was a ground test, the heater was powered by the ground support equipment, which runs at 65 volts. The thermostats were rated for 30 volts maximum, and the too-high voltage welded the contacts shut. There was also a human watching an outside temperature gauge that registered the heat inside the tank, but the gauge was only designed to go up to 80 degrees Fahrenheit (about 200 degrees warmer than the sub-zero temperature at which the oxygen was meant to be stored). Since the needle never went above 80, he didn't realize the tank was getting up to 1,000 degrees inside, and as a result the insulation coating the electrical wires inside the tank melted... which left them vulnerable to short-circuiting and sparking.
Apollo 13 was launched on schedule with the faulty tank, and four days into the flight, they flipped on the stirring fans in the tank, the damaged wiring sparked and ignited the tattered remains of insulation, and fire inside the tank promptly exploded it.
- To make matters even worse, the Apollo craft carried two oxygen tanks for extra safety, but they shared some plumbing, and when tank #2 exploded, it took several critical parts of tank #1's plumbing with it. Result: the oxygen in both tanks was soon lost, and the astronauts inside would have died in a few hours if they hadn't been carrying a healthy lunar module with its own oxygen supply.
- At least the nuclear fuel rod cask was fine. A miniature nuclear pile was built to power some instruments that were to be left permanently on the moon, but just in case the mission never got to the moon, a ceramic cask was built to contain the nuclear fuel, and it was designed to survive a fiery reentry to Earth... just in case. And survive it did.
- The SNAP-27 carried on Apollo 13 (and Apollos 12, 14, 15, 16, and 17) was a Radioisotope Thermoelectric Generator, essentially an atomic battery, not a reactor (there's no chain-reaction fission going on, that's the clue). It basically turns the heat of spontaneous radioactive decay into electricity via thermocouples. It is similar in design to the RTGs carried on the Pioneer, Voyager, Cassini, and New Horizons probes. They've also been used to power lighthouses, Antarctic science experiments, and anywhere you need a decade's reliable power source. It was probably one of the most reliable components flying on Apollo 13. Even if the cask had ruptured during reentry, the plutonium inside would have remained intact and ended up at the bottom of the same 20,000 ft. deep trench, doing absolutely no harm to anyone for the next 5,000 years.
- As shown on MythBusters: plugged safety valve on water heater + thermostat failure = steam-powered ballistic missile. As reported all over the news, one such incident occurred in a strip mall in Burien, WA on July 28, 2001.
- Part of the start up procedure for an industrial or commercial boiler (a water heater is technically considered a type of boiler) involves getting the pressure (steam) or temperature (water) high enough to make the safety valves lift. If they don't, you shut it down. The things that hold them closed are set to lift automatically at the maximum working pressure/temperature. Broken gauges and sensors can still result in one exploding during startup if the valves (which don't require power) are broken because it won't shut down automatically and the person doing the startup won't know that there is a problem until it is too late.
- The original DC-10 airliner cargo door fault that caused the 1974 Paris disaster. Firstly, the cargo door opened outwards, as opposed to inwards. This meant that air pressure inside the plane would naturally try to force it open, requiring a complex set of locking hinges and pins to keep it closed. Secondly, the door handle was supposed to be impossible to close unless all the pins were safely latched, but in practice, if enough force was applied to the handle, the internal mechanisms would bend out of shape without latching. So, the door could still appear to be closed and locked even when it wasn't. Thirdly, warning placards to inform the ground crew of the potential problem were installed, but they were only in English, which most ground crews around the world couldn't read. And finally, when the door blew out, the pressure change collapsed the cabin floor and severed all of the aircraft's control lines, including the redundant backups, rendering the pilots helpless. Airliners have floor vents to prevent such a catastrophic failure, and the DC-10 DID have floor vents, just not in the area of the cargo door, for some reason.
- And the reason there were even warning placards in the first place? In 1972, an American Airlines flight had the exact same problem, but because the seats above the collapsed section of floor happened to be empty, only some of the control cables were severed, leaving just enough control to land the plane with no loss of life. Since the only way the FAA could force McDonnell Douglas to fix the planes was to ground them all until the doors were repaired, the head of the FAA instead made a gentlemen's agreement with McDonnell Douglas to fix the doors without grounding the fleet.
- There were floor vents in all the cargo compartments of the DC-10 - they simply were too small and the pressure in the cabin and in the cargo compartment didn't equalize fast enough to prevent the floor from collapsing.
- Modern Formula One cars have anti-stall systems in the engine management computer. These are very useful as long as they don't go off accidentally on the starting grid and put the car into neutral when it ideally should be in first. This is more embarrassing than dangerous though.
- The system is capable of forcing the car to continue moving when the driver attempts to stop. This caused test driver María de Villota to crash into a stationary truck, suffering serious injuries that may have led to her death 18 months later.
- HMS Ark Royal sank after being hit by a torpedo that, among other things, caused flooding that shut down the boiler which powered the emergency pumps and all the electrical generators, the ship having been built without dedicated emergency generators separate from the main system. Oops. (Poorly engineered and inadequate electrical systems were a "feature" of all Royal Navy ships of the period because their urgent need to rearm arose right when the Great Depression was making R&D funds hard to come by.)
- USS Enterprise (the fifth one, CV-6) had a steering engine breakdown in the middle of one of the carrier battles for Guadalcanal, jamming the rudder into a hard turn. Fortunately the crew had rigged an emergency steering engine in case this very thing happened. Unfortunately, everyone in the compartment with the backup was knocked out by toxic gas released from nearby fires. It took nearly thirty minutes for someone to reach the compartment, and before he could turn on the backup motor he passed out as well; fortunately he came to fifteen minutes later and managed to turn on the backup motor. While all this was going on, another Japanese air raid was detected approaching but turned away fifty miles out.
- The Japanese carriers at Midway had their emergency generators for the firefighting system just off the upper hangar deck, about at the midships line. This placed them on the same deck where any bomb that struck the carrier would probably explode, at about the spot enemy pilots would use as an aiming point. At least two of the carriers probably lost their backup generators to shrapnel from exploding bombs.
- Similarly, the British Navy during World War I had one of the safest and most efficient systems for transferring explosives between turrets and magazines. Unfortunately, Admiral David Beatty of the Battle Cruiser Fleet thought it wasn't efficient enough, and so decided not to use it, unlike his superior, commander of the Grand Fleet Admiral Jellicoe. The result? At the Battle of Jutland, both the battlecruisers of the Battle Cruiser Fleet and the dreadnoughts of the Grand Fleet sustained similar hits. But while the dreadnoughts stood up beautifully, firing back and damaging several of their German counterparts so badly that they were effectively forced out of the war, three battlecruisers exploded in as many minutes.
- Inversion: Electrical codes require failsafe protection (fuses or circuit breakers, for example) on all circuits, to stop the current flow when the wire gets hot enough to possibly catch fire. Aspiring electricians have the failsafe rules for preventing electrical fires hammered into their heads repeatedly (electrical fires being as much of a danger as electrical shock, if not more so). So it is jarring, at first, to learn that circuits for fire pumps MUST NOT have fuses or circuit breakers of any kind. Why? If the fire pump is running, it is assumed there is already a fire, and a fuse or breaker breaking the circuit (and shutting off the pump) isn't going to improve the situation.
- Probably applicable only to the American grids, which have a peculiar system where the neutral wire is isolated from the ground. European grids have the neutral grounded, so short circuits do not usually propagate for a large distance, making this requirement somewhat irrelevant.
- One result of this is that some American three-phase electrical outlets have up to five contacts: one for each of the three alternating-current phases, one for the ungrounded neutral wire, and one for the ground wire.
- The grounding wire is a failsafe, usually connected to the chassis of the device, so that a short circuit dumps its current there instead of across your hand or anything else touching it. However, you can "disable" it with a three-prong to two-prong adapter (ironically, the two-prong adapter has a tab so you can connect a grounding wire to it... but nobody ever does).
- Zinsco brand circuit breakers, installed in countless homes from the 1950s to the 1980s, were infamous for spontaneously arc-welding themselves into the "on" position, leading to thermal runaway and structural fire in the event of an overload or short. Federal Pacific Electric breakers were similarly non-UL-compliant and could randomly fail to trip after years of seemingly smooth operation.
- One of the main causes of the Three Mile Island nuclear accident was a pressure relief valve sticking open. At first the dangerous pressure was relieved, but then the coolant kept escaping through the stuck-open valve.
- Which the operators would've noticed, if the indicator light had been connected to the valve itself rather than the switch that controlled the valve.
- Adding to the problems, the plant was being operated with several alarm lights permanently lit, thanks to a fault in the system that made them read false; instead of fixing the issue, the operators just ignored them. So when the things those alarms were supposed to monitor actually reached the alarm point... no-one knew.
- Ironically, the manager on duty at the time of the accident had gone to see a movie the night before... The China Syndrome, which is about safety coverups at a nuclear power plant, complete with a near-meltdown situation.
- Another problem was the control room was equipped with more than 120 separate dials, alarms, and gauges, making it very difficult to isolate the root cause when the accident triggered virtually all of them at once.
- The SL-1 reactor, site of the only fatalities directly caused by a nuclear incident in the US. It was built for and run by the US Army as a prototype for a small, semi-portable reactor to power mobile command centers. A technician performing a maintenance test while the reactor was shut down was required to manually raise the reactor's only control rod a few inches. He raised it almost two feet. The reactor instantly became active and went prompt critical; the sudden power spike superheated the water in the reactor and flashed it to steam, and the pressure surge ejected the control rod, which impaled the technician against the roof of the compartment. Luckily, the failsafes that weren't violated or ignored to do this kicked in and shut down the reactor, but not before the other two people at the site were killed by the explosion, all three receiving enough radiation to require burial in lead-lined coffins entombed in cement.
- The Deepwater Horizon oil spill in the Gulf of Mexico occurred because the blowout preventer, a supposedly idiot-proof device that seals the pipe in the event of something like, say, a rig explosion, failed. It turns out that the device had been tampered with (one of the rams that would have sealed the pipe was taken out to make room for some kind of monitoring equipment, amongst other things) but it's still a great example.
- It gets better. That was the backup device. Someone noticed a problem with the primary during some tests, hence Transocean fitted the monitoring kit and said "Oh, don't worry, the backup will take care of it." Predictably, when it was called upon, the backup failed.
- It gets better still: rumor has it the alarm had been turned off so false alarms wouldn't wake people up. No wonder 11 people died.
- BP's safety record is one of the worst in that regard; disabling failsafes and monitors to increase productivity seems to be SOP for the company. For example, the earlier Texas City Refinery explosion occurred partly because someone had disabled an overflow alarm, which, when the other one broke, started a chain of events that killed 15 people.
- Transocean's record is arguably even worse: the company actually cited the year of the spill as its best year ever, with its fewest accidents.
- The Big Bayou Canot train wreck of 1993 happened because a barge struck a railroad bridge hard enough to kink the tracks, but not hard enough to actually break them, which would have set off warning signals and stopped the train.
- Cancer is an example of a failsafe failure of a failsafe failure of a failsafe failure of a failsafe failure. Precancerous cells are a normal occurrence in the human body due to imperfections in DNA replication. Fortunately, human cells have proliferation control mechanisms; failures in these systems can cause inappropriate cell division. On top of this, cells have other control mechanisms: cell-contact signals that stop division, cell survival signals, immune detection of cancerous cells, and telomeres limiting the number of cell replications. These other failsafes are overcome through natural selection and the law of large numbers, as mutations compound during DNA replication in subsequent generations of the cell line.
- Alzheimer's meanwhile is (to some extent) a problem because the failsafe works too well: the brain is equipped with so many tiers of redundancies and backups that it can suffer a huge amount of neural degradation before the person's everyday performance is noticeably affected - but this means that by the time the symptoms are obvious, it's because everything that can be done already has been and the battle is mostly lost. (Less of an issue in practice because no one knows - yet - how to cure Alzheimer's anyway even if it's caught early.)
- The fantastically elaborate Stuxnet worm managed to override every safety system meant to ensure that the gas centrifuges at the Natanz nuclear facility couldn't malfunction. The whole system, from the Windows operating system of the controlling workstation to the SCADA PLCs controlling the speed of the centrifuges, was essentially taken over by the worm. The worm even made the SCADA system "lie" to the computer connected to it by playing back data from a normal run, à la Speed, as the PLCs drove the centrifuges out of control. This was deliberate, of course, but it shows that human ingenuity, as well as human stupidity, can override failsafe systems.
- Which is why no reactor is connected to the outside world in any way, other than maybe a simple telephone (itself a separate system, to be extra sure).
- Thanks to USB drives and the genius at Microsoft who thought "let's allow plug-and-play media to run programs as soon as they're inserted, without the user knowing", that wasn't a problem.
- While that was a dumb design flaw in Windows (since fixed), ask yourself why computers in that sensitive facility even had USB ports? The ports should have been plugged, or otherwise physically disconnected...except those ports were needed to accept the PLC program code which was then transferred to the SCADA units themselves. The SCADA units were physically airgapped but needed a way to receive programming. The worm tampered with the PLC code on the Windows computer before it was sent to the USB drives and from there to the SCADA units. Thus Stuxnet becomes one of the rarest types of malware: able to use Sneakernet to jump an air gap.
- The Stuxnet worm is a bit of a special case, as it was deliberately designed to make the system fail; and, well, TV Tropes is not necessarily saying it was designed with the help of the company who built the centrifuges...
- The failsafes in the Fukushima Daiichi (Fukushima I, that's a Roman numeral one) nuclear power plant worked as intended after the 11 March 2011 earthquake in Japan, safely stopping all three of its operating reactors. But then the tsunami washed out all the emergency diesel generators that some genius had placed right at the shore, and the plant's connection to the grid had been severed by the quake. So the plant lost cooling at all six of its reactors, which led to the successive meltdowns of at least three of them.
- What happened to the diesel generators is known as a "common mode failure" in engineering circles, and it's one of the hardest hazards to anticipate and prepare for.
- Even that might have been dealt with thanks to another designed-in safety measure: the reactor's power system could take electricity from truck-mounted mobile generators, which were, in fact, on scene within a few hours. The only problem was that no one had verified that the generators and the power system they were supposed to supply emergency power for had compatible attachments for the power cables.
- The proposed Molten Salt Reactor design uses fuel that has to be molten for the reactor to work. The core is kept from draining by constantly cooling a flattened section of pipe so that a plug of salt stays frozen inside it; if power to the reactor building is lost, the plug melts and the fuel-salt drains into a passively-cooled drain tank configured to prevent nuclear fission from happening, averting this trope in the simplest possible way: a safety system powered entirely by gravity.
- The Russian submarine Kursk sank due to a fail-deadly design flaw: a faulty torpedo leaked hydrogen peroxide, which reacted with the torpedo casing, causing a first explosion that then set off the fuel and munitions in the torpedo bay. There are arguably several flaws in the design that let this happen (starting with explosions ideally not being the immediate consequence of a leaky pipe).
- Even this would not have been a problem had there not been a ventilation duct through the bulkhead. The bulkhead itself was strong enough to contain the explosion of the first torpedo (which had never had its welds tested, since it was only a practice torpedo with no warhead), but the vent was not, and it allowed the blast through to incapacitate everyone in the command room in the second compartment. The torpedo tube door should also have contained the blast, but it was a known issue that the doors often required several tries to close properly due to bad contacts. Finally, the emergency buoy, which should have released automatically in the event of a catastrophic failure, had been disabled after concerns during a previous Mediterranean mission that it would trigger accidentally and give away the submarine's position.
- Almost exactly the same accident befell HMS Sidon in 1955 (except that the rest of the crew were able to evacuate), and resulted in the British Navy dropping that torpedo design several decades earlier. Russia was basically either lucky or unlucky enough (depending on your point of view) to avoid similar incidents during the 20th century.
- The power outage during Super Bowl XLVII was ironically caused by the failsafe itself. A power relay, which was supposed to activate and relay power from another source in case of an outage, activated when it wasn't supposed to, causing a partial blackout in the stadium.
- Stories like these are mass-circulated in maintenance circles as reminders for safety and reasons for preemptive inspection: verifying the numerous safety pins and indicators on an ejection seat before even touching it, because a system capable of sending a grown man eighty feet in the air in under two seconds doesn't leave much behind if the aircraft is inside a hangar with a forty-foot ceiling; the overpressurized system that cracked an aircraft in half; the fuel manifold whose bell mouth was altered so it no longer knocked against a back nut for a fuselage panel, the knocking having been all that prevented it from forming a vacuum seal and caving in the manifold.
- sudo (in POSIX-style operating systems) and UAC (in Windows) are failsafes, of a sort, meant to make the user aware that whatever they're about to do may have an impact on the system. Most programs that attempt an action that changes the system will trigger them. Bypassing them, by running as root or by disabling UAC (or even leaving the latter at its default "recommended" setting), allows a program to do whatever it wants.
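The fail-safe default these mechanisms embody (deny the privileged action unless elevation was explicitly granted) can be sketched in a few lines of Python. This is purely illustrative: the names `require_elevation` and `PrivilegeError` are made up, and real sudo/UAC work very differently.

```python
# Illustrative sketch only; not how sudo or UAC are actually implemented.
# The fail-safe property: the default path is denial, so forgetting to
# grant elevation blocks the action rather than silently allowing it.

class PrivilegeError(Exception):
    """Raised when a privileged action is attempted without elevation."""

def require_elevation(action, elevated=False):
    """Perform a system-changing action only if elevation was granted."""
    if not elevated:
        # Fail safe: refuse by default.
        raise PrivilegeError(f"refusing {action!r}: elevation not granted")
    return f"performed {action!r}"

print(require_elevation("install driver", elevated=True))

try:
    require_elevation("install driver")   # no elevation: refused
except PrivilegeError as e:
    print(e)
```

Running everything as root is the equivalent of hard-wiring `elevated=True` at every call site: the gate still exists, but it can never say no.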
- An example of a double failsafe failure. After several tragic mid-air collisions (most notably over India in November 1996, claiming 349 lives), all airliners were equipped with TCAS (Traffic Alert and Collision Avoidance System), which detects other aircraft on a collision course and tells the crew whether to climb or descend to avoid a crash. Then, in January 2001, two Japan Airlines aircraft with a total of 677 people on board nearly collided over Suruga Bay, even though both were equipped with properly working TCAS. At the same moment the crews received TCAS advisories, the ATC controller also issued avoidance instructions that contradicted them: one crew followed TCAS and disregarded ATC, the other followed ATC and disregarded TCAS, and the two aircraft passed each other by 35 feet (11 metres). The Japanese investigation commission asked the ICAO (International Civil Aviation Organization) to establish a clear rule for whose commands have priority in such a situation. For some reason, the ICAO ignored the plea, and seventeen months later two aircraft collided over southern Germany for exactly the same reason, killing 71. Only then were TCAS commands given absolute priority.
- Over the course of WWII, the Germans added ever more elaborate booby-trap fuzes to the bombs they dropped on England, creating more and more complicated Wire Dilemma situations that would hopefully kill British bomb disposal technicians. Occasional instances of this trope allowed the British to safely disassemble the new bomb types, figure out how the fuzes worked, and from there work out how to disarm the bombs when they weren't broken. Of course, in this case the feature that failed was a fail-deadly, and its failure caused the bomb to fail safe; which is to say, the bomb didn't work.
- Probably the worst known roller coaster incident in history occurred on the Battersea Funfair Big Dipper on 30 May 1972. The lift chain malfunctioned, followed by the anti-rollback mechanism, and the train rolled back into the station and collided with the other train. Five children died and another 13 were injured. The park struggled on before closing at the end of the 1974 season.
- The Smiler roller coaster at Alton Towers suffered a bad incident (although not as bad as the Battersea incident) on 2 June 2015; the operator, following standard procedure, sent a test train around the track, but assumed that it had completed the circuit instead of checking that it had. He then sent a passenger train, and the ride safety systems detected the impending collision and shut down the ride — and the operator assumed that this was a malfunction in the safety systems, and manually restarted the ride without doing any further checking, probably using a key he wasn't supposed to have.
- Class D cargo holds were designed to be airtight in order to starve cargo fires of oxygen, preventing them from bringing down a plane. In the case of ValuJet Flight 592, however, the fire was fed by incorrectly declared, unsafely packaged oxygen generators, resulting in a self-sustaining inferno that brought the DC-9 down within minutes, killing all 110 occupants. Class D holds were discontinued after the accident, once the FAA realised how futile it was to hope a cargo fire would simply starve itself out.
- Automatic activation devices are lifesaving devices in skydiving, intended to automatically deploy the reserve parachute if the skydiver is still falling fast below a certain altitude because the main has failed or was never opened. They have saved hundreds of lives, but they can sometimes fail disastrously. Because they register air-pressure changes and acceleration, they can fire unintentionally if the skydiver tries something daring at very low altitude, resulting in both main and reserve canopies flying simultaneously. There are three possible configurations: a biplane, with one canopy above the other; a side-by-side, with the canopies next to each other; and a downplane, with the canopies pulling apart toward the ground, creating no lift. Biplane and side-by-side are merely nasty nuisances, but a downplane is likely to be fatal unless the skydiver a) has enough altitude and b) manages to cut away the main immediately. And pray...
- This is believed to have contributed to the Hinton train collision in Canada in 1986. A CN freight train ran a red signal and collided head-on with a passenger train, killing 23 people, including the freight crew in the locomotive. While the lead locomotive was equipped with a "dead man's pedal," the subsequent investigation found that it was common practice for CN crews to keep the pedal depressed with a heavy object so they didn't need to keep their feet on it. The engineer was found to have a number of health problems that put him at high risk of a heart attack or stroke, and the crew was also suffering from a severe lack of sleep due to shifting train schedules. One possible explanation is that the engineer was incapacitated and, with the dead man's pedal weighted down, the train kept running when it should have stopped. Ironically, the second engine had a newer reset safety control that didn't require engineers to keep their feet on the pedal, but it wasn't used because its cab wasn't as comfortable. After the accident, the railroad industry moved toward these safer controls.
- Modern internal combustion engines with multi-gear transmissions have at least two failsafes to protect the engine, transmission and/or drivetrain from catastrophic overspeed failures. The first is a governor that automatically shuts off the fuel flow at or slightly above redline. The second is a set of interlocks in the transmission meant to prevent gear changes from putting the engine, transmission and/or drivetrain into a state that would severely damage or destroy any of them. These are not new technologies: the former has been around since at least the 1940s (early gas turbines had them), and the latter was a standard feature of mid-'80s Honda 5-speed manual transmissions (shifting from 5th to reverse wasn't possible without stopping at neutral along the way). If both fail, you've probably just totaled your car.
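The two failsafes above can be sketched in a few lines of Python. The numbers and gear labels here are made up for illustration, not taken from any real engine control unit:

```python
# Illustrative sketch of the two automotive failsafes described above.
# REDLINE_RPM is a made-up figure, not from any real ECU.

REDLINE_RPM = 6500

def fuel_flow(rpm):
    """Governor: cut fuel at or above redline, so the failure mode is
    a slowing engine rather than a destroyed one."""
    return 0.0 if rpm >= REDLINE_RPM else 1.0

def shift_allowed(current_gear, target_gear):
    """Interlock: block shifts into a destructive state, e.g. the
    mid-'80s Honda rule that reverse is reachable only from neutral."""
    if target_gear == "R":
        return current_gear == "N"
    return True

print(fuel_flow(7000))          # above redline: fuel cut
print(shift_allowed("5", "R"))  # 5th straight to reverse: blocked
print(shift_allowed("N", "R"))  # reverse from neutral: allowed
```

Note that both checks fail toward the safe state: when in doubt, the governor starves the engine and the interlock refuses the shift, rather than the other way around.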