Darth Wiki / Idiot Programming

"I love lag, especially when you have eight processors and no excuse!"

You'd think it would be simple to program something to do something simple. Despite what some people may tell you, you're often right. In a perfect world, we wouldn't have programmers who constantly assumed that any simple task is a gargantuan effort that requires the importation of several processor-heavy, 100-megabyte libraries just to set up. In a perfect world, machines capable of performing 2 billion complex calculations per second wouldn't be brought to their knees by spin-locks and memory leaks. And, in a perfect world, these programmers, and the manufacturers who made these devices, would all be bankrupt.

This is not a perfect world.

The other side of this is, of course, Genius Programming.

See also Artificial Stupidity, Game-Breaking Bug (for video games).

In general, a problem in a piece of software shouldn't be considered an example if it occurs in a pre-release (pre-alpha, alpha and beta) build, or if there's no reason to believe that the software should be judged by professional standards (i.e. commercial software is fair game, but non-commercial software often isn't, unless it's intended to compete with commercial software).

There are also a few things that often aren't examples, even though they might look like they are — in particular, software that appears to use a lot of memory or storage space on your machine. Your operating system might use a gigabyte of memory on your machine, but that's not because it actually needs it — what's actually happening is that it's using extra memory in exchange for a speed boost. Even the most resource-hungry consumer OSes used today can run on dinosaurs (although finding such a rig may prove difficult).
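The memory-for-speed trade works the same way at every scale. Here's a minimal Python sketch of ours (nothing from any real OS) in which spending memory on cached results makes repeat work nearly free, the same bargain your operating system is striking with that gigabyte:

```python
import time
from functools import lru_cache

@lru_cache(maxsize=None)  # trade memory (stored results) for speed
def slow_square(n):
    time.sleep(0.01)      # stand-in for an expensive computation
    return n * n

start = time.perf_counter()
slow_square(12)                     # first call: pays the full cost
first = time.perf_counter() - start

start = time.perf_counter()
slow_square(12)                     # second call: answered from memory
second = time.perf_counter() - start

assert second < first               # the cached call is far faster
```

The cache "uses" memory the whole time, but give it back (clear it) and the program still works, just slower. That's roughly the position your OS's disk cache is in.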

Some galling examples can be found on The Daily WTF (here), particularly the CodeSOD section.

Brands With Their Own Folders

    Adobe – Its Name Is Mud, Literally. 
Look on the bright side, Adobe software can be used to make satires of Adobe itself.
  • Adobe Flash. You may notice it on this very site, taking up 100% of your CPU and 80 megabytes of your RAM to display a static image that would take up 12K as a JPG. Also seen on numerous video sites, as a player that drags brand-new multicore, multigigahertz computers to their knees in order to jerkily fail to play h.264 video that would run silky smooth on a Pentium II or G3 as an unwrapped file. Thank heavens for Flashblock. And also praise the Builder that YouTube videos can be streamed into external video players, such as VLC or SMPlayer with its very own YouTube browser. It's not perfect, but it is great for bypassing browser bloat on single-core CPUs, or simply for saving CPU while multitasking. This is far more efficient, as it makes heavy use of hardware video acceleration AND needs only a quarter to half of the CPU on a 2.8 GHz Pentium 4 system. Also, memory use is almost non-existent.

    Even simple programs, like Conway's Game of Life, which can easily run at 70 frames per second on a 486SX when written in C will struggle to run at 5 frames per minute when the same code is used in Flash on a computer more than a hundred times faster. And sometimes, the already-poor performance of Flash is compounded by the often badly coded applications written for it. To give an example, The BBC embeds audio and/or video files in pretty much every article on the BBC News website. Unfortunately the initial version of the Flash app they used to do this was so badly designed that any system with a processor below a Core i7 was pretty much guaranteed to be utterly brought to its knees for several minutes at a time while the player loaded. It took months for the app's performance problem to be fixed.
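    To underline just how light the computation actually is, here's a complete Game of Life generation step, written by us in plain Python rather than C or Flash; even a modest machine chews through thousands of these per second:

```python
from collections import Counter

def life_step(live):
    """Advance Conway's Game of Life one generation.
    `live` is a set of (x, y) coordinates of live cells."""
    # count how many live neighbours each cell on the board has
    counts = Counter(
        (x + dx, y + dy)
        for x, y in live
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    # a dead cell with exactly 3 neighbours is born;
    # a live cell with 2 or 3 neighbours survives
    return {c for c, n in counts.items() if n == 3 or (n == 2 and c in live)}

# a "blinker" flips between a horizontal and a vertical bar each generation
blinker = {(0, 1), (1, 1), (2, 1)}
assert life_step(life_step(blinker)) == blinker
```

That's the whole algorithm. Any environment that can't run this at a decent clip is doing something badly wrong.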

    A couple of versions ago, the Windows Flash installer would sometimes report insufficient disk space even when there was no such problem. The reason? The installer would check drive C for space regardless of the actual destination drive, even if C wasn't assigned to a hard drive at all. Compounding all these problems is the fact that Adobe appears to be deliberately crippling Flash in its capacity to perform its original purpose (vector-based animation) to try to get people to use it for what they want, which seems to be websites (hands up, everyone who thinks this sounds reasonable. Anyone? No one? Good).

    Flash has issues on the security front as well. For example, in early 2015, over a two-week period, Flash had three zero-day exploits appear and attack unsuspecting users. As each one was patched, the next appeared to make people's lives more miserable. There is a movement that Adobe is passively supporting called "Occupy Flash" that is pleading with users to uninstall the plug-in; with the rise of HTML5 content, this is quite feasible for certain people.

    On July 14, 2015, Mozilla announced that all versions of Flash were now blocked by default in Firefox, citing Adobe's slow response to patching publicly available security exploits, to the delight of tech people the world over. However, Adobe soon issued patches and Flash was unblocked, though the general feeling is that the incident added another nail to a coffin that's been a long time coming. YouTube has "given the finger" to Flash as well, and currently uses its own HTML5 player by default. Its memory consumption is reasonable for today's budget computers (budget Pentium and AMD APU systems, for example) considering it's running the full-featured YouTube site. It also, unlike the Flash player, supports 60 frames per second as well as playback-speed options. A good omen, perhaps.
  • Adobe Acrobat and Adobe Reader aren't much better either. Somehow, Ctrl+C to copy doesn't always work, despite the fact that this is one of the most basic features of any program that can display text. Sometimes it works, sometimes it silently fails, sometimes it fails and pops up an error message saying "An internal error has occurred", and sometimes it works but still pops up that error. And if you hit Ctrl+C four times in a row in a futile attempt to copy the same text, you might get one of each result despite the fact that nothing else changed between attempts. It's like playing a slot machine.

    Then there's the fact that the installer likes to add bloatware to the system in the form of "helper" programs that start with Windows, stay running as background tasks, and which the main program runs fine without, and the equally unneeded "Adobe AIR". Google searches for "Adobe Reader without air" are very common.

    The Linux versions were also somewhat troublesome. Sometimes, with several documents open, you couldn't switch between them using the mouse; the same went for the menus. You had to "unlock" the program from the document, either by clicking on the document or by using the keyboard to activate the menus. Adobe eventually stopped developing Adobe Reader for Linux, but that's not really a bad thing when less bloated PDF viewers, like Evince and Okular, exist (especially since those two also support other formats).

    Early editions of Adobe Reader X would take forever to display a single PDF, take up huge amounts of processing power, and even had parts of the menu and toolbars disappear at random.
  • Adobe Dreamweaver is known to crash when trying to load a file with a size that's an exact multiple of 8,192 bytes. This is actually a known issue and documented on Adobe's official support forums. The recommended solution? Open up the file in a text editor and add a few characters to change the file size.
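    The workaround is easy enough to automate. A hypothetical Python sketch of ours (the names are made up, not Adobe's) that pads any file whose size lands on the cursed multiple:

```python
import os

DREAMWEAVER_BLOCK = 8192  # sizes that are an exact multiple of this crash Dreamweaver

def pad_if_cursed(path):
    """Append one character when the file size is an exact
    multiple of 8,192 bytes; return whether padding happened."""
    size = os.path.getsize(path)
    if size > 0 and size % DREAMWEAVER_BLOCK == 0:
        with open(path, "ab") as f:
            f.write(b"\n")  # a single extra byte breaks the multiple
        return True
    return False
```

Run it over a project directory before opening anything and the "known issue" never triggers, which says a lot about where the bar is.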
  • Despite Adobe Photoshop having been the standard tool for professional digital artists for years, later releases and developments have their increasingly disillusioned artist userbase investigating the products of competitors and open source projects.

    For starters, the company has begun billing the program as (and going to great lengths to optimize it as) an idiot-proofed meme generator and photo sharpener, targeted toward the lucrative "casual users with way too much money to burn who think Photoshop is magic" market. As a result, the "exciting new features" added to Photoshop are frequently things like yet another way to remove red-eye, while basic glitches and quirks that artists have been wanting fixed for years remain ignored...

    These problems include things like the inability to freely scroll the canvas in certain modes, the failure to update key aspects of the program like the "liquify" tool for modern 64-bit multicore operation, and the missing implementation of fundamental industry-standard utilities like total keybinding control, tiled windows in full-screen mode, and changing the selected tool when the user changes tablet styluses.

    The best part? Those aforementioned "exciting new features" occasionally displace, complicate, and/or generally erode the performance of the Genius Programming aspects of the program that endeared it to artists in the first place. Meanwhile, Adobe seems baffled by the fact that many artists have expressed a complete disinterest in trading their already-paid-for program for a rented one that introduces more such issues on a regular basis.

    Apple - It Just Doesn't Work 
  • Apple products, especially iTunes, have a habit of downloading updates, installing them, then leaving behind a tens or even hundreds of megabytes (per update!) worth of temporary files which it doesn't clean up. Even if you update your iPod firmware, it'll leave behind the temporary files on the computer you used to update it. To add insult to injury, it leaves these files in its own application data directory instead of trying to look for a system-designated temporary directory, meaning any other program trying to find and clean up unneeded temporary files won't notice. It's like living with the worst roommate ever. To get rid of these wastes of space, you have to dig through your file system to find Apple's directory, look for the temporary directory within that, and delete the junk yourself.
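    The galling part is that asking the OS for its designated temporary directory is a one-liner in most environments. A Python sketch of the pattern iTunes skipped (Python standing in for whatever Apple actually uses):

```python
import os
import tempfile

# Ask the OS where temporary files belong, instead of hard-coding
# an application-data directory that cleanup tools will never check.
print("system temp directory:", tempfile.gettempdir())

# Better still, a file created this way is deleted automatically on close:
with tempfile.NamedTemporaryFile(prefix="fw-update-", delete=True) as tmp:
    tmp.write(b"firmware scratch data")
    tmp.flush()
    assert os.path.exists(tmp.name)   # exists while in use...
assert not os.path.exists(tmp.name)   # ...and is cleaned up afterwards
```

Nothing here is exotic; every mainstream platform has an equivalent call, which is exactly why leaving gigabytes of orphaned updater files behind reads as laziness rather than necessity.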

    It also always restarts the computer on Windows systems after applying updates, without warning, even if no restart is needed. It's annoying when you leave the updater running in the background and all of your programs suddenly start to close.

    Oh, and let's not forget the increasingly common issue where, for no discernible reason, any TV show episodes you bought from the iTunes Store will suddenly stop playing at all, rendering just an audio-free black screen and forcing a complete reinstall of iTunes and QuickTime just to get your TV shows watchable again.
  • For some reason, when Apple was releasing Safari for Windows for the first time, it had a tendency to crash when attempting to bookmark a page. A basic web browser action, and it killed the program.
  • It should be noted that in a hacker convention contest, Pwn2Own, Mac OS X was routinely the quickest to fall to an outside attack. And by quickest, we mean less than a minute. A common entry point for exploits? Safari. Whether or not Apple is improving browser security remains to be seen.
  • The original Apple II was one of the first home computers to have color graphics, but it had its share of problems:
    • Steve Wozniak studied the design of the electronics in Al Shugart's floppy disk drive and came up with a much simpler circuit that did the same thing. But his implementation had a fatal flaw: the connector on the interface cable that connected the drive to the controller card in the computer was not polarized or keyed - it could easily be connected backwards or misaligned, which would fry the drive's electronics when the equipment was powered up. (Shugart used a different connector which could not be inserted misaligned, and if it were connected backward it wouldn't damage anything; it just wouldn't work.) Apple "solved" this problem by adding a buffer chip between the cable and the rest of the circuit, whose purpose was to act as a multi-circuit fuse which would blow if the cable were misconnected, protecting the rest of the chips in the drive.
    • The power switch on the Apple II power supply was under-rated and had a tendency to burn out after repeated use. Unlike the "fuse" chip in the disk drives (which was socketed), the power switch was not user-replaceable. The recommended "fix": leave the power switch on all the time and use an external power switch to turn the computer off. At least one vendor offered an external power switch module shaped to fit nicely behind the Apple II, but most users simply plugged their computer into a standard power strip and used its on/off switch to turn their equipment off.
    • The AppleSoft BASIC interpreter (ported from the Microsoft BASIC interpreter used on the Altair, the TRS-80, and other computers) did not implement the Boolean operators (AND, OR and NOT) as bitwise operations. On most other BASIC interpreters (as well as many other "mathematical" programming languages), a programmer could isolate individual bits in an integer using these operators, but not in AppleSoft.
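    For the curious, the bit-isolation trick AppleSoft broke looks like this in Python, where AND/OR/NOT on integers really are bitwise:

```python
# Isolating and manipulating individual bits with bitwise operators,
# the capability AppleSoft BASIC's purely logical AND/OR/NOT lacked.
flags = 0b10110110

bit_2 = (flags >> 2) & 1     # read a single bit
low_nibble = flags & 0x0F    # mask off the low four bits
flags |= 1 << 6              # set bit 6
flags &= ~(1 << 4)           # clear bit 4
```

On a machine with a few kilobytes of RAM, packing eight flags into one byte this way mattered, which is why the omission stung.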
  • The old Apple III was three parts stupid and one part hubris; the case was completely unventilated and the CPU didn't even have a heat sink. Apple reckoned that the entire case was aluminum, which would work just fine as a heat sink, no need to put holes in our lovely machine! This led to the overheating chips actually becoming unseated from their sockets; tech support would advise customers to lift the machine a few inches off the desktop and drop it, the idea being that the shock would re-seat the chips. It subsequently turned out that the case wasn't the only problem, since a lot of the early Apple IIIs shipped with defective power circuitry that ran hotter than it was supposed to, but it helped turn what would have otherwise been an issue that affected a tiny fraction of Apple IIIs into a widespread problem. Well, at least it gave Cracked something to joke about.
    • A lesser, but still serious design problem existed with the Power Mac G4 Cube. Like the iMacs of that era, it had no cooling fan and relied on a top-mounted cooling vent to let heat out of the chassis. The problem was that the Cube had more powerful hardware crammed into a smaller space than the classic iMacs, meaning that the entirely passive cooling setup was barely enough to keep the system cool. If the vent was even slightly blocked however, then the system would rapidly overheat. Add to that the problem of the Cube's design being perfect for putting sheets of paper (or worse still, books) on top of the cooling vent, and it gets worse. Granted, this situation relied on foolishness by the user for it to occur, but it was still a silly decision to leave out a cooling fan (and one that thankfully wasn't repeated when Apple tried the same concept again with the Mac Mini).
    • Another issue related to heat is that Apple has a serious track record of not applying thermal grease appropriately in their systems. Most DIY computer builders know that a rice-grain-sized glob of thermal grease is enough. Apple pretty much caked the chips that needed it with thermal grease.
    • Heat issues are also bad for MacBook Pros. Not so much for casual users, but very much so under heavy processor load. Since the MBP is pretty much de rigueur for musicians (and almost as much for graphic designers and moviemakers), this is a rather annoying problem, since Photoshop with a lot of images or layers, or any music software with a large number of tracks, WILL drive your temperature through the roof. Those who choose to game on a MBP have it even worse: World of Warcraft will start to cook your MBP within 30 minutes of playing, especially in a warm room. The solution? Get the free software programs Temperature Monitor and SMCFanControl. Keep an eye on your temps and be very liberal with upping the fans: the only downsides to doing so are more noise, a drop in battery time, and possible fan wear, all FAR better than your main system components being fried or worn down early.
  • Apple made a big mistake with one generation of the iPhone: depending on how you held it, it couldn't receive a signal. The iPhone 4's antenna is integrated into its outside design as a bare stainless steel band around its edge, with a small gap somewhere along the way. Good signal strength relies on this gap staying open, but if you hold the phone wrong (which "accidentally" happens to be the most comfortable way to do so, especially if you're left-handed), your palm covers that gap and, if it's in the least bit damp, shorts it, rendering the antenna completely useless. Lacquering the outside of the antenna, or simply moving the air gap a bit so it doesn't get shorted by the user's hand, would have solved the problem in a heartbeat, but apparently Apple is much more concerned about its "product identity" than about its users. Apple's suggestion to users? "Hold it right." As it turned out, Apple would soon be selling modification kits for $25 a pop, for an issue that by all standards should have been fixed for free. Apple got sued by at least three major parties over the fiasco.
  • MacBook disc drives are often finicky, sometimes not reading the disc at all and getting it stuck in the drive. The presented solutions? Restarting your computer, or holding down the mouse button until it ejects. Even that isn't guaranteed; sometimes the disc will jut out just enough that the solution won't register at all, and pushing it in with a pair of tweezers finishes the job. To put this in perspective, technologically inferior video game consoles (Wii, PlayStation 3) handled slot-loading disc drives far better.
  • Apple Maps, which came pre-packaged in Obvious Beta form with iOS 6, had several glaring problems right out of the gate. The satellite imaging was prone to inexcusable errors, from odd patchworks of pictures taken under markedly different conditions (some of which wouldn't even be considered usable, as they were covered in clouds or of very low quality) to severely misshapen landscapes and buildings. The map function frequently misplaced entire cities (and, on at least one occasion, continents), was outright missing countless locations (while inventing several new ones), and the search and GPS were just plain broken. It became the topic of ridicule, with entire websites dedicated to mocking its shortcomings. Even the London Underground got in on it, with a sign reading "For the benefit of passengers using Apple iOS 6, local area maps are available from the booking office."

    A few, hearing about the problems with the new Maps app, refused to upgrade to iOS 6 until Google came out with a third-party Google Maps app for the platform (replicating the old app's functionality, if not its interface).

    That's only the start of the iPhone 5's troubles. People are reporting a purple flare when taking photos. Apple's advice? "You're just holding it wrong."
  • Tired of the problems with the Microsoft Task Manager? Apple's is even worse. It's accessed from the "Apple menu" that always appears on the left-hand side of the top menu bar. But since in Mac OS the top menu bar is always controlled by the selected window, a program can crash in a way that prevents you from opening that menu to close the crashed program. Oh, and just like Windows, the Finder and the Dock have no special privileges, so broken software can prevent you from doing the most basic file manipulation.
  • Apple Music has a really nice "feature". It scans your local files, deletes any music that matches what it thinks it already has in its catalog, and if it doesn't have a track, uploads a copy to its servers transcoded to AAC, regardless of the original encoding. Which means not only is Apple deleting files off your drive; if you're an independent musician or composer, Apple effectively steals your music. All because Apple figures that if you wanted to listen to your music from the cloud, there's no point in having a local copy, right? Never mind that you need a Wi-Fi connection for this to work. Oh, and if your subscription runs out, you can't access it, even your own actual music.

    Google's Not Feeling So Lucky 
Even Google needs to search for a clue at times.
  • In an effort to restore the tainted image of Internet Explorer, Microsoft touted that it's more power-efficient than competing browsers. The claim mostly wasn't taken seriously... until Google admitted to not bothering to patch a bug that had caused the Windows version of Chrome to guzzle battery life for years.

    To explain: Windows Vista, 7, and 8 use a default timer precision of 15.6 ms. Programs that are waiting on something will schedule a timer based on this precision. Programs can adjust the precision downward, to as little as 1 ms. Google Chrome did just that, and kept the timer firing even when it probably wasn't needed. The side effect is that Windows gives the system less time to idle.
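    As rough arithmetic (our figures, derived from the numbers above), the requested 1 ms resolution means roughly sixteen times as many timer interrupts dragging the CPU out of its low-power idle state:

```python
# Why a 1 ms timer hurts battery: it multiplies how often the CPU
# is forced out of idle. Periods are the ones cited in the text.
default_period_ms = 15.6   # Windows default timer resolution
chrome_period_ms = 1.0     # resolution Chrome requested (the minimum)

wakeups_default = 1000 / default_period_ms   # ~64 timer interrupts per second
wakeups_chrome = 1000 / chrome_period_ms     # 1000 timer interrupts per second

print(f"roughly {wakeups_chrome / wakeups_default:.0f}x more timer interrupts")
```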
  • Unlike most other browsers, Google Chrome runs each tab in a separate process. As any programmer could tell you, the immediate downside of this is extraordinarily high RAM usage, as Chrome is essentially running a duplicate of itself for every tab you have open. The ostensible purpose is stability: if something goes wrong in one tab (such as Flash crashing; see its entry on this page), it won't bring down the entire session. However, anyone who tries to run Chrome alongside any other program besides maybe Windows Explorer without a near-bottomless well of RAM, and gets the "tab/Google Chrome has crashed" error messages on a daily basis, is probably questioning whether this design decision was worth the trade-off, or even whether it achieves its intended purpose.
  • So what do you do when you've had enough and want answers from Google Customer Service? Call them? I hope you weren't planning to do that over Google Voice, because the application is incompatible with the Google Customer Service line.
  • Android: If you for some reason have to reset your device, you have to use your Google account to unlock it. This would sound fine, if it weren't for the fact that it forces you to use your old password and doesn't accept a more recent one.
  • Android's application rights management is a joke compared to iOS's. On Android, applications don't request rights, they demand them: Google Play will warn you before installation that the application wants to access, say, the Internet, but the only way to forbid it from doing so is to not install the app at all! You can't simply install the application in a restricted sandbox.
  • Android's power-management system makes it highly vulnerable to rogue applications. To explain it easily: by itself the system always tries to go into deep sleep, where everything is as powered-down as it can be to save battery life - unless an app tells the system "hey, please wake up, I'd like some CPU power to do stuff" using something called a wakelock. Well-made apps make sparing use of wakelocks - for instance, to check messages every few seconds by very briefly waking up the phone. But a badly-made app can request wakelocks continuously, keeping your phone awake and drawing battery power uselessly. There is no easy way to prevent this in current versions of Android. An experienced user can install all sorts of monitoring applications and try to hunt down and delete (or freeze temporarily) the accursed rogue app, but it's often a difficult and frustrating task, made worse by the fact that applications that used to behave fine can break during an upgrade and become rogue. Importantly, the hunt-down-the-power-sucker rigmarole is not a basic-level operation, and a lot of average users don't even know it exists - leading many to complain about the battery life of handsets that really should be lasting a lot longer.
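    The well-behaved pattern is simply "acquire briefly, always release". A purely illustrative Python sketch of ours (the real Android API is Java's PowerManager and WakeLock; nothing here is actual Android code):

```python
from contextlib import contextmanager

held_wakelocks = set()   # stand-in for the kernel's table of held wakelocks

@contextmanager
def wakelock(tag):
    """Keep the (pretend) CPU awake only while the block runs."""
    held_wakelocks.add(tag)
    try:
        yield
    finally:
        held_wakelocks.discard(tag)   # released even if the work raises

def check_messages():
    with wakelock("msg-sync"):   # brief, bounded hold: a well-behaved app
        pass                     # ...fetch messages here...

check_messages()
assert not held_wakelocks        # nothing left keeping the CPU awake
```

A rogue app is one that acquires and simply never reaches the release, and, as the entry says, Android gives the user no easy way to notice which one it was.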

    Hewlett-Packard: Inventing New Headaches 
Why does ink have to be more expensive than fine wine to realize a profit margin?
  • HP's printers, scanners, and fax machines come with software to go with the hardware. All of that software except the printer drivers is optional, but HP does everything short of flat-out lying to suggest that the other crap is required. If you do install the optional software, the stupid thing will pop up an "Error: Your HP printer is disconnected" message every time you turn off the printer.
  • Some HP computers come with batteries or power supply units that are known to explode. Literally, with no exaggeration: they release sparks and smoke (and this is a "known issue"). Others overheat and burst into flames. And there have been multiple recalls, proving that they obviously didn't learn from the first one.
  • In The New Tens, HP and several other printer companies started using DRM in their printers, of all devices. It was meant to block refilled cartridges in order to protect printer ink's ridiculously high markup. However, the checks are notoriously inaccurate, which means they randomly reject new cartridges fresh out of the box for no apparent reason.
    • This is because part of the system is a microprocessor on the ink tanks that will completely brick the cartridge after a certain date. That's right - ink cartridges now ship with an expiration date that the retailer sees on the shipment box, but since they're not "perishables" or food items, they're not required to display this information on their packaging.
  • The Hewlett-Packard support assistant that comes with many of their computers, such as the Pavilion, is very poor at automating updates of included software such as CyberLink's utilities and HP's own service software. Batch updating often results in several updates failing to install, requiring the user to manually download the failed updates themselves.
  • In an effort to better suit beginning users, Hewlett-Packard includes bundles of software on their pre-configured computers (including the infamous Norton security software) that add functionality like CD & DVD authoring tools, some WildTangent game trials, and other stuff. Unfortunately, if you're a power user who wants to install only your preferred software, this software just gets in the way, and can potentially interfere with the smooth operation of Windows. Worse, if you took Microsoft's free offer to upgrade to Windows 10, the upgrade may not go as expected, leaving a barely usable system with slowdowns and even fatal crashes. Fortunately, if your system is a disaster, you can use a software tool (such as "ProduKey") to copy down your installation key, download the Windows Media Creation Tool to make a bootable installation DVD, and install a clean copy of Windows 10 without the extra bloat on the hard drive and processor.
  • Some of HP's printers come with a feature called HP Smart Install. Ostensibly, this is supposed to make the printer quick and easy to set up by using an autorun program to install its drivers. However, most computers will simply detect the printer as a CD drive when plugged in, and even if the setup program is run manually it will fail halfway through. In fact, going into the printer's menu (if it has a physical control panel) and disabling Smart Install is the better option, as the computer will search for the drivers online and install them itself, with the added bonus of not installing the extra, unnecessary software included in the Smart Install package.

    Intel - Experience What's Inside, Including the Pain 
Being a leader in processor development doesn't mean everything worked out the first time.
  • The "Prescott" core Pentium 4 has a reputation for being pretty much the worst CPU design in history. It had some design trade-offs which lessened the processor's performance-per-clock compared to the original Pentium 4 design, but theoretically allowed the Prescott to run at much higher clockspeeds. Unfortunately, these changes also made the Prescott vastly hotter than the original design, making it impossible for Intel to actually achieve the clockspeeds they wanted. Moreover, they totally bottlenecked the processor's performance, meaning that Intel's usual performance-increasing tricks (more cache and faster system buses) did nothing to help. By the time Intel came up with a new processor that put them back in the lead, the once hugely valuable "Pentium" brand had been rendered utterly worthless by the whole Prescott fiasco, and the new processor was instead called the Core 2. The Pentium name is still in use, but is applied to the mid-range processors that Intel puts out for cheap-ish computers, somewhere between the low-end Celerons and the high-end Core line.
  • The Prescott probably deserves the title of worst x86 CPU design ever (although there might be a case for the 80286), but allow us to introduce you to Intel's other CPU project of the same era: the Itanium. Designed for servers, using a bunch of incredibly cutting-edge hardware design ideas, and promised to be incredibly fast. The catch? It could only hit that theoretical speed if the compiler generated perfectly optimized machine code for it. It turned out you couldn't optimize most of the code that runs on servers that hard, because programming languages suck, and even if you could, the compilers of the time weren't up to it. And if you didn't give the thing perfectly optimized code, it ran about half as fast as the Pentium 4 and sucked down twice as much electricity doing it. Did we mention this was right about the time server-farm operators started getting serious about cutting their electricity and HVAC bills?
    • Making things worse, this was actually Intel's third attempt at implementing such a design. The failure of their first effort, the iAPX-432 was somewhat forgivable, given that it wasn't really possible to achieve what Intel wanted on the manufacturing processes available in the early eighties. What really should have taught them the folly of their ways came later in the decade with the i860, a much better implementation of what they had tried to achieve with the iAPX-432... which still happened to be both slower and vastly more expensive than not only the 80386 (bear in mind Intel released the 80486 a few months before the i860) but also the i960, a much simpler and cheaper design which subsequently became the Ensemble Darkhorse of Intel, and is still used today in certain roles.
    • To be fair, in the relatively few situations where it gets the chance to shine, the Itanium 2 and its successors can achieve some truly awesome performance figures. The first Itanium, on the other hand, was an absolute joke. Even if you managed to get all your codepaths and data flows absolutely optimal, the chip would only perform as well as a similarly clocked Pentium III. Intel actually went so far as to recommend that only software developers should even think about buying systems based on the first Itanium, and that everyone else should wait for the Itanium 2, which probably ranks as one of the most humiliating moments in the company's history.
      • The failure of the first Itanium was largely down to the horrible cache system Intel designed for it. While the L1 and L2 caches were both reasonably fast (though the L2 cache was a little on the small side), the L3 cache used the same off-chip cache system designed three years previously for the original Pentium II Xeon. By the time the Itanium hit the streets, however, running external cache chips at CPU speed just wasn't possible anymore without some compromise, so Intel decided to give them extremely high latency. This proved to be an absolutely disastrous design choice and basically negated the benefit of the cache. Moreover, Itanium instructions are four times larger than x86 ones, leaving the chip strangled between its useless L3 cache and L1 and L2 caches that weren't big or fast enough to compensate. Most of the improvement in the Itanium 2 came from Intel simply making the L1 and L2 caches similar sizes but much faster, and incorporating the L3 cache into the CPU die.
  • While Intel's CPU designers have mostly been able to avoid any crippling hardware-level bugs since the infamous FDIV bug in 1994 (say what you like about the Pentium 4, at least it could divide numbers correctly), their chipset designers seem much more prone to making screw-ups:
    • Firstly there was the optional Memory Translator Hub (MTH) component of the 820 chipset, which was supposed to allow the usage of more reasonably priced SDRAM instead of the uber-expensive RDRAM that the baseline 820 was only compatible with. Unfortunately the MTH basically didn't work at all in this role (causing abysmally poor performance and system instability) and was rapidly discontinued, eventually forcing Intel to create the completely new 815 chipset to provide a more reasonable alternative for consumers.
    • Then there were the 915 and 925 chipsets; both had serious design flaws in their first production run, which required a respin to correct, and ultimately forced Intel to ditch the versions they had planned with wi-fi chips integrated into the chipset itself.
    • The P67 and H67 chipsets were found to have a design error that supplied too much power to the SATA 3Gbps controllers, which would cause them to burn out over time (though the 6Gbps controllers were unaffected, oddly enough).
    • The high-end X79 chipset was planned to have a ton of storage features available, such as up to a dozen Serial Attached SCSI ports along with a dedicated secondary DMI link for storage functions... only for it to turn out that none of said features actually worked, meaning that it ended up being released with fewer features than its consumer counterparts.
    • A less severe problem afflicts the initial runs of the Z87 and H87 chipsets, in which USB 3.0 devices can fail to wake up when the system comes out of standby, and have to be physically disconnected and reconnected for the system to pick them up again.
    • Speaking of the 820 chipset, anyone remember RDRAM? It was touted by Intel and Rambus as a high performance RAM for the Pentium III to be used in conjunction with the 820. But implementation-wise, it was not up to snuff (in fact benchmarks revealed that applications ran slower with RDRAM than with the older SDRAM!), not to mention very expensive, and third party chipset makers (such as SiS, who gained some fame during this era) went to cheaper DDR RAM instead (and begrudgingly, so did Intel, leaving Rambus with egg on its face), which ultimately became the de facto industry standard. RDRAM still found use in other applications though (like the Nintendo 64 and the PlayStation 2).
      • A small explanation of what happened: Rambus RDRAM memory is more serial in nature than more-traditional memory like SDRAM (which is parallel). The idea was that RDRAM could use a high clock rate to compensate for the narrow bit width (RDRAM also used a neat innovation: dual data rate, using both halves of the clock signal to send data; however, two could play that game, and DDR SDRAM soon followed). But there were two problems. First, all this conversion required additional complex (and patented) hardware which raised the cost. Second, and more critically, this kind of electrical maneuvering involves conversions and so on, which adds latency... and memory is one of the areas where latency is a key metric: the lower the better. SDRAM, for all its faults, operated more on a Keep It Simple Stupid principle, and it worked, and later versions of the technology introduced necessary complexities at a gradual pace (such as the DDR2/3 preference for matched pairs/trios of modules), making them more tolerable.
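The latency-versus-bandwidth trade-off above can be sketched with some back-of-the-envelope arithmetic. Note that every latency and bandwidth figure below is a made-up, merely plausible assumption for illustration, not the measured spec of any real SDRAM or RDRAM module:

```python
# Rough sketch: total time for one memory transfer is a fixed latency
# plus (size / bandwidth). All figures here are illustrative assumptions.

def transfer_time_ns(latency_ns, bytes_moved, bandwidth_gb_per_s):
    """Time for one transfer in nanoseconds.

    1 GB/s conveniently equals 1 byte/ns, so the units line up.
    """
    return latency_ns + bytes_moved / bandwidth_gb_per_s

# The common case for a CPU is fetching a single 64-byte cache line:
sdram = transfer_time_ns(latency_ns=45, bytes_moved=64, bandwidth_gb_per_s=1.064)
rdram = transfer_time_ns(latency_ns=77, bytes_moved=64, bandwidth_gb_per_s=1.6)

print(f"SDRAM-like: {sdram:.0f} ns")  # lower latency, lower bandwidth
print(f"RDRAM-like: {rdram:.0f} ns")  # higher latency, higher bandwidth
# For small transfers the fixed latency dominates, so the higher-bandwidth
# RAM can still lose; raw bandwidth only pays off on long streaming reads.
```

With these assumed numbers, the "slower" SDRAM wins the cache-line fetch, which is exactly the benchmark pattern that embarrassed RDRAM.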
  • Intel's SSDs have a particular failure mode that rubs people the wrong way. After a certain amount of writes, the drive goes into read only mode. This is great and all until you consider that the number of writes is often lower compared to other SSDs of similar technology (up to about 500 terabytes versus 5 petabytes) and that you only have one chance to read the data off the drive. If you reboot, the drive then goes to an unusable state, regardless of whether or not the data on the drive is still good.

    Microsoft Doesn't Take You Where You Want to Go Today 
With great market share comes great exposure to malicious coders. Keep those virus definitions updated, and Windows Updates downloading.
  • Microsoft in general tends to have a lot of problems with Not Invented Here and an inability to let go of problematic legacy code and designs (often because lots of third-party software relies on the problematic behavior) in their own unsung examples of Idiot Programming.
  • In fact, similar to The Daily WTF, Microsoft veteran Raymond Chen's blog The Old New Thing is a good place to look for explanations of things that at first seem to be Idiot Programming on Microsoft's part. As Chen puts it, "no matter what you do, somebody will call you an idiot".
    • As of October 2010, his blog had documented a long list of programs that can be described as "we found this program doing stupid shit and had to work around it". One of the reasons Vista was so poorly received was that a lot of programs doing stuff they shouldn't have done wouldn't work properly. "Properly" being based on guidelines formed around 2001 and enforced in 2007.
  • Mac Word 6. So legendarily bad, the Windows version ran faster in an emulator. Bear in mind that this was back when PCs ran x86 code and Macs were on the 68k architecture. Exacerbated by the quality of its immediate predecessor, Mac Word 5.1, often regarded even today as Microsoft's finest work and possibly the best word processor ever written. The troubles of Word 6 were mainly due to an attempt to make it universally coded (i.e., both Mac and PC-friendly), but the result was a slow, memory-intensive clunker.
  • Versions of Word as recently as 2003 have had a document filesize limit of 32 megabytes, a limit that could easily be reached by a document with 30 reasonably-sized photos embedded in it.
  • Despite its beautiful and fairly responsive user interface, Windows Live Mail has several glaring flaws, such as pegging the CPU for a full minute to fetch mail from an IMAP server and popping up an error message every time it is disconnected, even though a disconnect should be a Foregone Conclusion if you leave a dormant connection lying open for several minutes with no activity.
  • Every version of Microsoft Windows gets this when it first comes out (except, strangely, for Windows 7), but Windows Vista and Windows ME have received the most backlash. The common belief now is that most of Windows Vista's bad reception came from a sub-optimal initial release, which had a number of serious bugs relating to file transfers and networking (they mostly caused speed problems rather than data corruption ones, but it made using pre-SP1 versions of Vista a pain in the backside). Most of the serious problems were fixed with the first service release, but Vista's reputation, which had already been dented by its failure to live up to Microsoft's early promises, never really recovered. Even after applying service packs, Vista is still very noticeably dog-slow compared with Windows 7/8/10, especially on budget Pentium D systems and comparable AMD chipsets.
    • Windows ME, on the other hand, was arguably the worst operating system (apart from the infamously broken MS-DOS 4.00) ever released by Microsoft, to the extent that geeks have been wondering for years whether it was some kind of Springtime for Hitler plot to make the upcoming NT-based "Whistler" (what would subsequently become Windows XP) look better. Perhaps the biggest problem was that the memory management system was so catastrophically broken that up to a third of any given system's RAM was often rendered unusable due to memory leaks. Moreover, System Restore (which would become a well-loved feature in XP and beyond) severely hurt performance, not helped by the aforementioned memory management problem, and would quite often fail to restore your documents and important files, but did restore viruses and malware.
      • A particularly facepalm-worthy bug: ME, for the first time, supported Zip files without an external program. This was back in an era when diskettes were still ubiquitous, so a Zip file spanned across multiple diskettes was not a particularly uncommon situation. Opening a spanned archive would result in a prompt for the first diskette... and it would keep asking until you produced that diskette. If it was lost, reformatted or damaged, you were out of luck, because there was no way to cancel out of that dialog box and no way to terminate it without terminating Explorer.
      • Another stunning Windows ME design decision that seems brilliant on paper but turned out to be terrible in execution: Originally in Windows 3.x and 9x, if a program was stalling, you could press CTRL+ALT+DEL to interrupt it, and pressing the buttons again would force a reboot. In Windows ME, CTRL+ALT+DEL became the key combination to open up the Task Manager instead, but pressing the buttons again would still force a reboot. However, since this is ME we're talking about, there was absolutely no way to tell if the computer was majorly lagging or if it didn't read the input at all. And, of course, if the system is lagging, it still keeps track of which buttons you've pressed. So, naturally, when you push CTRL+ALT+DEL to open up the Task Manager just to shut down a program that isn't working right, the Task Manager doesn't appear. Wait thirty seconds. Still nothing, so you push CTRL+ALT+DEL again. After another ten seconds, it finally pops up. You go to close the program, and then...."Windows is shutting down."
      • ME's biggest problem was that it supported two driver types: the new, NT-style WDM drivers we're all comfortable with today, and the old VxDs that originated way back in the days of Windows 2.1. The idea was to phase out the old drivers while maintaining backward compatibility; alas, what actually happened is that the two driver types didn't play nice with each other. As a result, if you needed a combination of the two driver types for your computer (which almost everybody did) ME crashed often and with great enthusiasm. However, if you were one of the lucky few whose hardware only needed one driver type, WinME actually worked kind of decently - hence the occasional user who can't quite figure out why everybody hated ME when it worked so well for them.
    • Windows XP was pretty decent in most aspects when it was released...except for the OS's security, which was broken beyond belief, even if it wasn't obvious at the time of release. This owed to an attempt to make it backwards-compatible with the past four OSes. Famously, it was demonstrated that if you installed the RTM build of XP on a computer in mid-2007 and browsed the internet for just an hour, the OS would be hopelessly corrupted by viruses and malware, to the point of requiring a complete reformat and reinstall of the system.
      • This was exacerbated by the decision to assign the main user account administrative privileges, opening up many ways to potentially ruin the operating system. This was done so the operating system would behave more like Windows 98 for compatibility's sake, a very risky choice that left an uncontested back door open to attacks and required patch after patch to remedy.
    • Vista was particularly hilarious in the way it restructured so many things that Microsoft actually had to set up workshops to teach people how to use it; customers found these workshops very helpful. Snarkers were quick to pick up on the fact that Vista was perfectly intuitive, provided you had a trained expert holding your hand every step of the way.
    • A major annoyance for new Vista/7 users who migrate from XP: the Read Only bug. Any hard disk with a NTFS file system that was created in XP that gets imported into a Vista/7 system will by default have all files and folders stored in it as read only, even for a user with administrative privileges, and even if one uses the "take ownership" feature. The solution would be to go to the Security properties of each and every file and folder (the fastest way would be to go to the file system's root directory, select all files, and apply the following steps to all child files and folders), add the current user account to the list, declare it the owner, and grant all privileges.
    • Windows Update may sometimes screw up the bootloader on the hard drive, necessitating a reinstall of sorts to fix this. Particularly annoying if the Windows install didn't install the recovery environment and thus, a way to fix the computer. Also, the occasional Windows update seems not to regard the drive letter of the Windows installation it's being installed on, instead just selecting the earliest drive letter with a Windows installation on it, or some other arbitrary means. This is evidenced by trying the update, having it fail, turning off the computer and physically disconnecting the other drive, trying again, and having the update work.
    • Users that updated to Windows 8.1 via the Microsoft Store noticed one crucial feature about Windows 8 that was really nice to have: Refresh. This way you can effectively "reinstall" Windows without having an install disk. The problem? It actually relied on a special file (presumably an ISO copy of the install disk) and the 8.1 update failed to include this, making it impossible to use the Refresh command. Whoops. Though some people have figured out a way around this.
    • Windows Vista introduced the ProgramData folder and encouraged application developers to store system data there; in many applications, adding content was as simple as dragging and dropping new files in for the program to read. Cue Windows 8.1 making the directory read-only and disabling the ability to remove the read-only flag... the convoluted process by which Windows manages file permissions, and the fact that it always marks system folders as read-only, doesn't help.
    • Windows Update on many machines (from Vista through 8.1) may sometimes install a driver update for Ethernet and Wi-fi adapters. This would not be significantly problematic, save that the drivers installed are produced by Microsoft instead of Atheros or the actual hardware producer, and that after installation, the affiliated network connections do not properly resolve DHCP. DHCP, or the Dynamic Host Configuration Protocol, is what allows computers to obtain an IP address on their local network, determine how to route connections, and learn which DNS servers to use, meaning that said computers are no longer able to resolve names like 'google.com' to an actual address. Have fun replacing your drivers without a System Restore!
    • A notorious problem is that Task Manager, the interface primarily used to halt crashed programs, became a standard program itself from Windows Vista onward. This means that it can be impossible to start Task Manager if the crashed program is "spin locking" (constantly using the CPU without achieving anything) because Task Manager has no special claim on the CPU compared to the crashed program. Task Manager also tries to place itself on top of other windows, but again has no special claim on this, so can be forced to the back by full screen or aggressive programs - an exploit frequently used by malware. Ironic because in Windows 3.1, Task Manager took priority over absolutely everything, although this may have been because of the much simpler cooperative multitasking architecture of the time.
      • Likewise, Windows Explorer (the software that provides the desktop interface) is considered just another program, not part of the operating system. This means that even fundamental tasks like starting programs and moving and copying files can be disrupted by other programs.
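The "spin locking" failure mode described above can be sketched in a few lines. This is a deliberately simplified stand-in for a hung program, not how any real Windows component is written: a busy-wait loop pegs a core at 100% while accomplishing nothing, and an equal-priority Task Manager has no special right to preempt it.

```python
# Sketch: busy-wait ("spin") versus a proper blocking wait.
import threading
import time

flag = threading.Event()  # the condition the "hung" program is waiting on

def spin_wait(timeout_s=0.05):
    """Busy-waits on the flag: checks it in a tight loop, never sleeping.

    This is the pathological pattern; the CPU is fully occupied the
    whole time even though no useful work gets done.
    """
    deadline = time.monotonic() + timeout_s
    spins = 0
    while not flag.is_set() and time.monotonic() < deadline:
        spins += 1  # pure CPU burn
    return spins

def blocking_wait(timeout_s=0.05):
    """Parks the thread in the kernel: near-zero CPU until woken or timed out."""
    return flag.wait(timeout_s)

spins = spin_wait()
print(f"busy-wait looped {spins} times in 50 ms without doing any work")
```

The well-behaved `blocking_wait` yields the CPU to the scheduler, which is why a single hung-but-sleeping program never made Task Manager unreachable; a spinner at the same priority could.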
  • The Zune software. The interface is fine, but at first, it devoured RAM and took up way too much CPU power for what it does. But each subsequent release managed to improve performance, to the point where as of version 4 even machines that don't have much higher than the minimum spec can run it with all the visual effects turned on with little to no problem. iTunes, on the other hand, started as bloated garbage and got even slower and more bloated over time.
    • The Zune player itself, on the other hand, had a leap-year glitch, which made it freeze up on New Year's Eve of a leap year because of the clock driver screwing up how it handles leap years.
    • The Zune was also incompatible with PlaysForSure-protected media. Microsoft apparently can't even maintain compatibility with its own stuff.
  • Windows Live Hotmail. Until late 2010, Opera and Chrome had to spoof as something else before Microsoft would allow access, despite being entirely capable of handling everything the application did. They also had to supply their own scripts, because the ones served by the site itself were broken.
    • In 2011, they fell prey to yet another issue. They attempted to fight spam… by preventing it from leaving the user's draft box. Perfectly legitimate mail is often blocked, which they acknowledge, giving no clue how to change your message so it can actually be sent beyond "re-edit it so it looks less spam-like." "Spam-like" has, among other definitions, content-free subject lines such as "RE: How's it going?".
  • Active Desktop was an optional Windows 95 update released in '97 in an effort to catch up with that "World Wide Web" thing that had taken Microsoft by surprise and capitalize on the new push technology hype (basically RSS feeds). The concept was ahead of its time: you could place webpages and things like weather and stock updates right on your desktop and bypass the browser altogether. It also gave your folders a spiffy overhaul, introduced the quick launch bar and made everything clickable look like hyperlinks. In fact, folder windows were browser windows and you could type both URLs and folder paths into the address bar. There was one problem (aside from the need to be constantly connected over pay-per-minute dialup to receive push updates): many user interface elements were essentially outsourced to your browser, and this was back when a crash in one browser window tended to take down all others with it. The browser was the paragon of instability known as Internet Explorer 4. You can see where this is going.
    • Things got more sensible and less crash-prone in Windows 98, but the desktop components remained unstable all the way until Microsoft realized no one used the feature for exactly this reason and replaced it with desktop gadgets in Windows Vista.
  • Microsoft Outlook uses one giant .pst blob for all emails, which tends to get corrupted once it reaches two gigabytes. This page acknowledges this, and implies that it's the user's fault for exceeding the size limitation, since users have nothing better to worry about in their lives. Note how the error doesn't come up until after it's a potential problem, and the fix simply truncates the .pst to 2GB, destroying the messages that don't make the cut.
    • Prior to Version 7, Exchange did much the same thing: all mail for its users was stored in a single flat file on the Exchange server. This file was generally created in its entirety in advance and populated over time rather than constantly expanding. Problems: if the file became "fragmented" as users deleted messages, it would require a compression cycle, which required the server be taken offline for possibly hours. Additionally, if the file reached its limit, it would simply stop accepting new messages while acting to users like nothing was wrong. The file could be increased in size, but only to about 16GB (as of Exchange 5). The safest solution was to migrate to Exchange 7, which in and of itself is a nightmare that often required rebuilding the entire system to deal with the absolute requirement of Active Directory.
    • Outlook 2007 had a very bizarre problem where merely having the font Helvetica (yes, the one that font-snobs will say is the BEST FONT EVER) installed on the PC would cause Outlook to crash whenever opening a new e-mail.
    • Outlook 2013. Want to see your old messages in the inbox? Well, you can't! You have to click a damn link at the bottom of the screen "to see more". You just changed folders and want to return to the inbox? Guess what, you need to click on the damnable link again! There are no options to display all your messages in your inbox.
    • You've attached a file and opened it up to make sure it's good before sending. Outlook will warn you that you're about to alter the file and ask you if you want to continue or not. Even if you keep saying no, it will not stop harassing you.
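The recurring "2 GB" ceiling in the old .pst format (and plenty of other legacy file formats) comes down to simple arithmetic: if byte offsets are stored in signed 32-bit fields, nothing past 2^31 - 1 bytes can be addressed. The sketch below is a hypothetical illustration of that arithmetic, not a description of the .pst format's actual internals:

```python
# Why "2 GB" shows up as a hard wall: a signed 32-bit offset field
# cannot address any byte past 2**31 - 1.
INT32_MAX = 2**31 - 1  # 2,147,483,647 bytes, one byte short of 2 GiB

def fits_in_signed_32(offset):
    """True if this byte offset is representable as a signed 32-bit integer."""
    return 0 <= offset <= INT32_MAX

print(fits_in_signed_32(2 * 1024**3 - 1))  # one byte under 2 GiB: True
print(fits_in_signed_32(2 * 1024**3))      # exactly 2 GiB: False
```

Which is also why the symptom only surfaces *after* the file crosses the line: every offset written before that point was perfectly valid.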
  • Internet Explorer:
    • Internet Explorer 6... hoo boy! At the time of its release it was widely seen as being a decent enough browser, though a lot of people complained that it hadn't really changed that much since version 4. What became obvious as the years progressed — something made all the worse by Microsoft's decision to cease development on Internet Explorer and only upgrade it when what would eventually become Windows Vista was released; while they eventually backtracked on this policy and released it for older versions of Windows, the eventual Internet Explorer 7 wouldn't come until near the end of 2006, five years after IE6's introduction — was that it suffered from unbelievably poor security, to the point where new exploits were being found literally every day by the hacker community by early 2004. This, combined with Microsoft's stubborn refusal to do anything more than patch the most serious bugs, effectively restarted the Browser Wars, which had largely been over since 1998, and saw Microsoft's >90% share of the browser market get demolished by Chrome, Firefox and the other browsers, to the point where only around 20% of PC users use Internet Explorer as their primary browser today (in comparison, over 40% of PC users use Chrome as their primary browser).

      The transition from IE6 to newer browsers wasn't helped by IE6's support for HTML standards somehow being even worse than its security, requiring massive re-coding of websites. In fact, some had to maintain separate IE6-compatible versions for several years after version 7 was released. On top of all of this, a ton of corporate intranet applications were designed with IE6 in mind and simply didn't work at all on any other browser, forcing many big companies to stick with the browser (and, by extension, Windows XP) even to this day, thus still posing a massive security risk. There are several more things we could talk about, but the bottom line is, IE6 is often regarded as being not just Microsoft's worst product, but arguably one of the worst tech products of all time.
    • Internet Explorer 7 and 8 both talked up a "new commitment to standards and performance", with each one certifiably supporting more features than its predecessor, but each paling in comparison to every other browser available when released. IE7 did fix some of the most severe bugs that IE6 had suffered from, but the underlying engine was near-identical with most of the new features being cosmetic, and for the most part the browser was just as insecure and bug-ridden as its predecessor. IE8 by comparison had a redesigned engine that fixed most of the security problems, but added a new problem in that it kinda sucked at rendering older websites. Microsoft tried to divert attention from this by hyping up its "Web Slices" and "Accelerators", both of which were features that only Internet Explorer supported, but all the other browsers could feel free to implement themselves! While this trick worked for Netscape during the first Browser War (until Microsoft ended it by fiat by bundling IE4 with Windows 98), it didn't take this time around, and both versions languished in obscurity, hemorrhaging market share all the while.
    • With Internet Explorer 9, however, it looks like Microsoft has learned its lesson and is finally going to avert this, with development focusing exclusively on W3C-standardized features (as in HTML5, CSS3, and EcmaScript 5), many of which every other browser already supports, and some that they don't, but only ones that are part of the World Wide Web Consortium-approved standards. Of the tests that Microsoft has submitted that IE9 passes but other browsers fail, most are passed by at least one other browser, with some submitted tests not even passing in Internet Explorer 9, but passing in Opera or Safari.
      • Many legacy features are finally being re-architected to match the reality of modern Web browsing, such as JavaScript being implemented as part of the browser instead of going through the generic Windows COM script interface that was introduced over a decade ago and used through IE8.
      • To top it off, the IE9 Platform Previews run completely platform-agnostic examples an order of magnitude faster than every other browser out there (by implementing hardware-accelerated video through Windows' new Direct2D API).
    • From a plugin standpoint, get a load of these load times. The AVG toolbar adds, on average, a full second to the load time every time you create a new tab. On top of lots of other boneheaded on-load hooks, one developer actually incorporated network calls into their addon's initialization routine, meaning that, until it received a response from a remote server, it would block your tab from opening.
    • Not that the later, stabler versions of Internet Explorer don't have their share of irritating issues. IE has the ability to reopen your last browsing session after a crash. Seems fair enough... until you realize that this will not only reopen the tabs and windows you had open but anywhere from five to over thirty additional windows open to your home page, for no apparent reason.
      • It usually keeps track of any browser window that crashes, and opens all of them the next time you're prompted. Normally, this isn't a problem, but if you open IE by clicking a link (or double-clicking an internet shortcut on your desktop/start menu) while it's your default browser, you won't be prompted to re-open your last session. If this one crashes, too, it's added to the record. You can see where this is going. Open 20 browser windows like that, crash them all, and when you finally open one by just double-clicking the IE icon and choose to restore your last session, it brings all 20 back. A good practice, taken much too far.
  • Microsoft Office 2007 has some neat features that were previously unavailable, but those features take a backseat to some of the problems it has:
    • The program is a RAM hog, making it cripplingly slow, even on machines with 4GB of RAM.
    • The toolbars are nowhere near as customizable as previous versions, leaving you with the "ribbon" at the top, which takes up a good portion of your screen.
    • Many of the shortcuts have been eliminated. If you're the kind of person who likes to use a keyboard instead of the mouse, you're out of luck.
    • Many of the features have been renamed, but the Help feature doesn't help you with this at all. It would have been nice to go to the help menu, type in the name of the feature you want to use, and have it give you the name of the new feature. If you had a function that you used in a previous version, you have to figure out what the new version calls it.
    • Many features have been shuffled around, too. Using Microsoft's unusual naming and categorizing strategy, you have to figure out where your features went, which is especially difficult if you aren't sure what the new version calls it, or if it was removed entirely.
      • For example, in Office 2003 and previous versions, if you wanted to edit the header or footer, you would go to View, then Header and Footer. In Office 2007, if you want to edit the header or footer, you have to go to "Insert", then "Header & Footer."
    • Like previous versions, if you do a lot of typing in Word, you're going to spend most of your time looking at the bottom of the screen. The only way to avoid this is to continuously scroll up.
  • Microsoft's own spellchecker doesn't recognize Microsoft's own words, such as "PowerPoint", unless you capitalize them exactly right.
  • Games for Windows Live was Microsoft's attempt to take on Steam, and given Microsoft's sheer resources, many thought they would have some success in this. To put it lightly, however...they didn't. Poor design choices all around meant that it never attracted many users, and was eventually discontinued in August 2013, to be replaced by an integrated app store in Windows 8 — and for all that OS's faults, the app store sensibly decided to target casual games, a market which Steam and the others haven't exploited as much. As for why the GFWL marketplace was discontinued:
    • To install a game, you had to have enough room to store three entire copies of that game on your hard drive. For some games, that's well over 30 gigabytes (and by the end of the store's life, could reach 100 gigabytes). Contrast with Steam, which requires enough room to store one entire copy. You know, the copy that you actually use.
    • Also, while Steam will automatically update your games for you so that you never have to worry about not having your game up to date, Games for Windows Live would only tell you it needed to update a game when you actually tried to play it. You know, the exact moment when you don't want to wait several minutes for your game to be ready.
      • Not to mention that it's a crapshoot over whether you can actually get that update - DLC content can be denied to a player for no reason whatsoever. Red Faction: Guerrilla's multiplayer mode was rendered entirely unplayable because GFWL basically decided that faking an update and then slowing the game to a crawl rather than finding whatever it was looking for sounded like jolly-good fun.
    • Another terrible aspect of GFWL is that a lot of games using it have their save games locked down in a way that makes you essentially lose them every time you reinstall a game or try to transfer your progress to another system. Again, this compares unfavorably to Steam, which either just keeps out of the way of save files in the first place or, with Steam Cloud, outright embraces transferring them to different systems or installations.
    • Also, if you somehow registered for the wrong country, there was no way to go back and change it. At all, not even with customer service. The only solution was to create another account with a different name (and lose everything in the previous one, of course).
    • It's so bad that Microsoft blacklisted its own program in Windows 8.1. If you attempt to install GFWL, it will ask "This program has compatibility issues, do you want to run it?"
  • Versions of Windows Media Player randomly crash with a cryptic message about "server execution" failing, even when you're trying to play a simple .wav file on your computer with no DRM and network sharing disabled, and there's no conceivable reason it would even need to access a remote server in the first place.
    • Also, Windows Media Player started as a very simplistic, lean-and-clean little program that did its basic job (playing multimedia files), and did it decently well. Then each new version added more and more eyecandy, increasing its resource consumption and decreasing its usability, up to the point where it became almost unusable. (This is the reason why Media Player Classic was made: to give people the good old WMP without the cruft.) The Windows 7 version of WMP backed off a bit on the eyecandy and cruft and is at least barely usable.
  • Absolutely any Windows installer that ignores the default install directory in the registry and tries to put the app in C:\Program Files regardless belongs on this page, but especially if:
    1. The documentation advises against putting the app in Program Files (because, say, it tries to save its settings in its own directory, multiple users be damned).
    2. The installer tries to put the app in Program Files even when you tell it to put it somewhere else.
    3. The installer refuses to work at all if you don't have a C: drive.
    4. The program is 32-bit, and puts itself in the Program Files folder instead of Program Files (x86) on a 64-bit computer.
    5. Perhaps most infuriatingly, telling the program to install somewhere else will cause it to install some of itself in the specified location, and the rest in "Program Files" anyway. Bonus points if it only makes a token effort, putting a couple megabytes where you told it and hundreds in the default location.
  • With the development of Windows 10, Microsoft released an update for older operating systems that does nothing except pester users to get the new OS. This would merely be annoying if it didn't cause the occasional error, slow down boot time, and use lag-inducing amounts of processing power. All for the purpose of informing users of Windows 7 and 8 about an OS that hasn't even been released yet.
    • Even after removing the update that caused those pop-ups and hiding it in Windows Update, it would unhide itself later so it could be re-installed.
    • In its relentless drive to get users to upgrade, Microsoft offered Windows 10 for free to users of Windows 7 and 8 as an optional update through Windows Update. Many users have Windows Update set to "download optional updates and ask before installing". Cue *four gigabytes* being delivered to users in countries with low data caps or on netbook type devices with minimal onboard storage. Regular Windows updates are several orders of magnitude smaller.
    • Windows 10 installs updates by default, leaving home users with no way to defer or refuse defective updates. Before launch, Microsoft claimed they would perform rigorous testing of each new update through their fast-track program. Long story short, the first update that left users with an unusable computer occurred before launch, as the beta phase was winding down.
    • If you're on a previous version of Windows, notice it installing the "upgrade to Windows 10!" nag screens, decide "hell no you don't", and delete the applet and registry keys (probably using GWX Control Panel) after it has downloaded and started the preinstall but before rebooting and letting it finish, the system will be broken beyond rescue and will need a full reinstall.
  • If you thought GUI programming in general was bad (see below), well, Microsoft has you covered. Windows has five separate GUI toolkits (Win32, MFC, Forms, WPF and Metro). Win32 is the one all the others are implemented on top of, but its (never used nowadays) default settings produce UIs that look like they came from Windows 3.1. MFC is a C++ library on top of it, but was designed before C++ was fully standardized and comprises numerous ugly hacks, such as DIY exception management and classes that simulate diamond inheritance. Windows Forms is the much cleaner .NET-based equivalent, and WPF is supposed to be the revolutionary new interface for the Windows desktop, but Microsoft roughly alternates between apparently abandoning each one. And then comes Metro, better known as "who put a smartphone UI on my desktop?".
    • And the real killer with this is that because Forms and WPF are designed around C# and the .NET CLR, there is no up-to-date native code UI toolkit for Windows. This has resulted in a proliferation of third-party toolkits (all implemented atop Win32).... which in turn ties down MS to keeping those old toolkits around, because there's so much software using old third-party toolkits that require the old interfaces to be maintained bug-for-bug.
    • It's also kind of bad if you want to program in Ruby. Or Python. Or Factor. Or any language whose low-level libraries are written in C, which is practically all of them. Because of the awkwardness of the Windows native UI libraries, they tend to use cross-platform UNIX-based ones; which often have very badly maintained Windows ports, and again produce software that looks more than 10 years old without extensive retooling. (Java is the one exception.)
    • Did we mention that all the work to transform the programmer's use of MFC/Forms/WPF etc into Win32 is done by your computer and your CPU, every time any program you ever run under Windows does anything?
  • With Windows 10 now available, tech magazines said good things about it, and Microsoft made an effort to win back the Windows 7 demographic. However, AMD A-Series users who upgraded may have found that their system (notably pre-configured ones, like Hewlett-Packard Pavilion laptops) became very unstable, with Explorer suffering badly, forcing users to either revert to Windows 8/7, restore a backup image from an earlier time, or (if they copied down their installation key with a key-viewing program) burn a re-installation disc and attempt a Microsoft-sanctioned clean install.

    The nice part about a clean install is that it frees your computer of the crapware and outdated drivers that may be gimping it; this can let the computer function better than the OEM configuration (more on that problem in the Hewlett-Packard section). Why Microsoft doesn't prevent unstable drivers or programs from transferring to Windows 10 is another puzzler. Microsoft has also been pleading with computer manufacturers to stop this practice of weighing down its operating system with crapware, since the blame spills over onto Microsoft.

    While not a total system-breaker, Windows 10's Explorer still has some stray bugs here and there. Explorer, which includes both the Start menu interface and the file manager proper, was once prone to seizing up temporarily and eventually coming free, or being restarted by Windows if it had hung for a certain period of time. This even occurred on a reasonably new computer with a fresh install of Windows 10, which is puzzling to say the least. It was especially annoying for anyone who had just come from Windows 8.1, whose Start interface was radically different from Windows 7's but noticeably less likely to trip up and restart.

    Thankfully, the November 2015 Service Pack for Windows 10 solved many of the bugs that the initial release had, and it appears Windows 10 is on its way to being just as stable as Windows 8.1 was. The upgrade even helps by resetting some of the configurations and apps (without totally resetting the system) for a like-new feel.

    However, the System Restore feature still needs help. If you ever need to roll back the system, there is no indication whether the available restore points are usable until you attempt to use one. Maybe it's anti-virus software misbehaving, but it can be like flipping a coin to see if your restore point works. Chances are you've tried rolling back the system only for it to reboot and Windows to notify you that your system has not been changed and the restore point failed. You may as well ignore System Restore and instead use Control Panel\System and Security\Backup and Restore (Windows 7) to save a drive image to a removable HDD or flash drive and create an emergency boot DVD. Use this combo for when your computer goes insane; there is too much that can go wrong with System Restore.
  • As of version 7.18, Skype has a tendency to crash whenever the user tries to copy something out of a chat window, or even when they try to browse through folders! (Windows Explorer being a completely separate process which shares no resources whatsoever with Skype, this last one is particularly inexcusable.) It's never a good sign when the official solution for a problem is "switch to an older version."

    Sony Only Does Everything Wrong 
  • As discovered in the months when the PS3 was hacked, the code that signs software for the system was found to use a constant "random" number (the ECDSA nonce, which must be unique for every signature) across all signatures, and it was barely obfuscated at all. Some simple algebra was all it took to recover Sony's private signing key.
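The "simple algebra" can be reproduced on a toy curve. This is only a sketch: the textbook curve y² = x³ + 2x + 2 over GF(17) and all the numbers below are stand-ins, not the PS3's actual parameters, but the recovery algebra is exactly the same: if two signatures share the nonce k, both k and the private key fall out.

```python
# Toy ECDSA over a textbook curve, small enough to check by hand.
p, a, b = 17, 2, 2          # curve y^2 = x^3 + 2x + 2 over GF(17)
G, n = (5, 1), 19           # base point and its (prime) order

def inv(x, m):
    return pow(x, -1, m)    # modular inverse (Python 3.8+)

def add(P, Q):
    # Standard affine point addition; None is the point at infinity.
    if P is None: return Q
    if Q is None: return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % p == 0:
        return None
    if P == Q:
        lam = (3 * x1 * x1 + a) * inv(2 * y1, p) % p
    else:
        lam = (y2 - y1) * inv(x2 - x1, p) % p
    x3 = (lam * lam - x1 - x2) % p
    return (x3, (lam * (x1 - x3) - y1) % p)

def mul(k, P):
    # Double-and-add scalar multiplication.
    R = None
    while k:
        if k & 1: R = add(R, P)
        P = add(P, P); k >>= 1
    return R

def sign(d, z, k):
    # ECDSA: r = x(kG) mod n, s = k^-1 (z + r d) mod n.
    r = mul(k, G)[0] % n
    return r, inv(k, n) * (z + r * d) % n

d, k = 7, 5                  # private key, and the reused "random" nonce
r1, s1 = sign(d, 3, k)       # two signatures over message hashes 3 and 11,
r2, s2 = sign(d, 11, k)      # both made with the SAME k
assert r1 == r2              # identical r values give the reuse away

# The algebra: k = (z1 - z2)/(s1 - s2), then d = (s1*k - z1)/r, all mod n.
k_rec = (3 - 11) * inv(s1 - s2, n) % n
d_rec = (s1 * k_rec - 3) * inv(r1, n) % n
assert (k_rec, d_rec) == (5, 7)   # nonce and private key fully recovered
```

Scaling the curve up to a real 256-bit one changes nothing about the attack; the division is just as cheap.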
  • The Sony rootkit, designed to install itself whenever a user placed an audio CD in their computer. Ironically, it wouldn't get installed on many users' computers because it required administrative privileges, and a safe setup will deny those privileges to prevent just this kind of software from installing. On top of that, the rootkit installed via AutoPlay, which means (on Windows XP and earlier, before AutoPlay was changed to be prompt-only) you could defeat it by disabling AutoPlay altogether, or simply by holding the Shift key when inserting the disc. If the rootkit did manage to get itself installed on a paying customer's computer, it would slow down the CPU AND open up gigantic security holes that would invite (additional) malware.

    Sony later released a program that would supposedly remove the rootkit, but it only installed more crap. And downloading it required submitting a valid e-mail address, which Sony was free to sell to spammers. It's perhaps the greatest example of an anti-piracy action that does nothing to discourage piracy and, indeed, punishes people for a legitimate purchase. All this in the name of Copy Protection. Because Digital Piracy Is Evil, and vandalizing customers' computers is apparently better than possibly letting them copy a CD they've legally bought.
  • Another idiotic Sony DRM idea: Key2Audio, a DRM system that worked by violating the Red Book Compact Disc standard and putting a dummy track claiming the disc was empty around the outer edge of the disc (which is read first by PC disc drives, while stereos read from the inner track first). The trick to breaking this one? Keep the outer track from being read. How to do that? Draw over the edge with a permanent marker.
  • Still Sony: the Playstation 3. A firmware bug in which some models believed that 2010 was a leap year resulted in lockouts on single player games due to the machine refusing to connect to the PlayStation Network. What was the reason for this system having such a perilous dependency on the judgement of its system clock? DRM!
    • The bug stemmed from the hardware using binary-coded decimal for the clock. Because apparently converting that time for display is so difficult for the eight core Cell processor.
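Sony never published the firmware source, so the following is a plausible reconstruction of the failure mode rather than the actual code, but a BCD clock register read as plain binary reproduces the symptom exactly: the byte for "(20)10" misreads as 16, and 2016 is a leap year.

```python
def bcd_to_int(byte):
    # Correct decode of a binary-coded-decimal byte: each nibble is a digit.
    return (byte >> 4) * 10 + (byte & 0x0F)

raw = 0x10                  # the RTC's BCD encoding of the year digits "10"
naive = raw                 # buggy read: treats the byte as binary, 0x10 == 16
correct = bcd_to_int(raw)   # proper decode: 10

import calendar
# "2016" IS a leap year, 2010 is not, hence the phantom February 29.
assert calendar.isleap(2000 + naive)
assert not calendar.isleap(2000 + correct)
```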
  • And another Sony facepalm to add to the pile: Sony releases frequent updates for their PlayStation Portable, mostly in an attempt to fix "security holes" that would allow running "homebrew" applications in the name of preventing piracy. On one occasion a fix for an exploit that would allow such "unauthorized" code to run with the use of a specific game ended up opening an exploit that required no game at all.
    • Sony tried to do the same with the Playstation 3, in addition to numerous other security features such as the Hypervisor and the Cell Processor's SPE Isolation. As the hacking group Fail0verflow (the same guys responsible for the major Wii breakthrough) discovered, the only bits of security that are actually implemented well are usermode/kernelmode, per-console keys, and the "on-die boot ROM" - everything else was either bypassed or broken through. This includes the public-key cryptography. Yes, the cryptography in the PS3 used to check the signature on software was cracked, and Sony's private keys (which are used to sign software for the PS3) were obtained.
  • In a possibly related hack, custom firmware enabled hackers to obtain free games and DLC from the PSN store. Why? Sony made the classic mistake of trusting the client software and assuming a certain variable in the PS3's firmware could never be modified.
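The fix is a textbook rule: entitlements live on the server, and a flag sent by the client is never consulted. A minimal sketch, with every name made up for illustration:

```python
# Server-side source of truth: which console owns which content.
ENTITLEMENTS = {"console-123": {"game_a"}}

def authorize_download(console_id, item, client_says_owned):
    # The classic mistake is `return client_says_owned`, trusting a
    # variable the PS3's (modifiable) firmware sends up. Instead, the
    # claim is ignored and only the server's own records are checked.
    return item in ENTITLEMENTS.get(console_id, set())
```

With this shape, patching the firmware to always claim ownership accomplishes nothing, because the server never asks.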
  • The PlayStation 4 seems to have an issue with the capacitive sensor in its disc eject button. Over time it tends to spuriously activate, leaving the console spitting out discs and then refusing to allow them back in, making it impossible to play any game that comes on a disc. The only official Sony solution? Replace the entire console under warranty, or buy a new one. Never mind that a software patch letting the user tell the system to ignore the hardware button would be trivial (it's not a mechanical system, and discs can already be ejected through the system menus).

Multiple Brands, Sorted By Category

    Alternate Operating Systems; Bugs At No Extra Charge 
A free or low-cost operating system is great, but bad programming (or none at all) can ruin the fun quickly.
  • Linpus Linux Lite, as shipped with the Acer Aspire One. Now, in fairness to Linpus, its GUI could not possibly be more intuitive (plus a boot time of just 20 seconds, and out-of-the-box recognition of xD picture cards and others beyond SD, if the computer has a reader that supports them). But there is designing a distro for complete beginners, and then there is designing a distro with several directories hard-coded to be read-only and Add/Remove Programs accessible only via some fairly complex command-line tinkering. That the sum total of its official documentation is a ten-page manual containing no information an experienced user couldn't figure out within five minutes of booting doesn't help, nor does the fact that updating it was hell.
    • In Brazil, many low-end computers are sold with horrible Linux distros in order to claim tax breaks for using locally-developed software. Stuff which cannot be updated without major breakage, full of security holes, old versions of packages, and so on, to the point that it seems many people only buy them so they can install a pirated copy of Windows to save money. Same in Hungary, because of a law prescribing that no computer can be sold without an operating system (the other loophole is paying the computer store for parts and assembly, so on paper you're not buying a computer).
      • Same in Italy, where identical laws have caused computers - modern ones, with multicore processors and several gigabytes of RAM - to be sold with FreeDOS.
  • While Linux is sometimes described as the most stable OS, the opposite was true for the 1.1 kernel, which was notoriously unstable and bug-prone and had poor backward compatibility. Users quickly reverted to version 1.0 and waited for 1.2. This is the reason behind the old convention of reserving odd minor version numbers for unstable development releases.

     Automotive Annoyances 
It's times like this that may make you wonder if you'd rather drive a trusty classic car.
  • Automobiles are starting to gain internet connectivity, which means a hacker may amuse himself by forcing you to lose control of the vehicle. Maybe that vintage car is worth driving for now. Yes, this makes it possible to attempt murder with un-patched vehicles, no joking here. Thankfully, White Hats worked with Chrysler to prevent this security vulnerability.
  • One of the issues with modern cars (especially luxury models) is the sheer number of computer systems that can cause error codes to register on an OBD-II scan tool. The more computer parts involved in the vehicle's construction, the more likely something will eventually malfunction. Connector corrosion alone can be enough to prevent your vehicle from even starting, thanks to an electronic engine part malfunctioning. Some of these computer parts can come from a bad production run and fail unexpectedly, leading to issues such as a traction control system going into "fail-safe" due to breakage.
  • The Jeep Grand Cherokee, while not a perfect sport utility vehicle, has been a desirable model from Chrysler, matched or exceeded by the popular Jeep Wrangler. However, the 2011 model was the hardware equivalent of an Obvious Beta and a horrific alleged car, due to a device called the "totally integrated power module" (T.I.P.M.). The complaints are so frequent that Car Complaints has issued a warning to avoid this model like the plague. Terrible experiences include the fuel pump shutting off in transit, killing the power steering and brakes and leading to a crash. Thankfully, Chrysler got a grip on the problem, and by the 2013 model the complaints had been drastically reduced. For comparison, the 2010 model had vastly fewer complaints.

    Digital Rights Mismanagement 
Hey developers, we paid for your software and you will show the proper respect.
  • Dragon Age: Origins has a really bad bit with its installer where, when it asks for your CD key, you can use Task Manager (AKA "Ctrl-Alt-Delete") to stop one of the processes that is part of the installer and skip it. Under the DMCA, Task Manager is now illegal, alongside markers and the Shift key.
  • Games using SecuROM V10 or below will also fail to launch (without explanation) if they detect a process named "procexp.exe" (Sysinternals Process Explorer, a program provided by Microsoft), ostensibly out of fear that hackers will use it to reverse engineer their DRM process, even though Process Explorer is basically just a beefed up version of Task Manager, and Process Monitor is the one that reveals any action taken by any program whatsoever. There are two ways you can circumvent this:
    • Close ProcExp and reopen it immediately after starting the game.
    • Rename the ProcExp executable to anything else. Or be running a 64-bit version of Windows, where it's now called procexp64.exe.
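SecuROM's internals aren't public, but the behavior described above (defeated by a simple rename) implies a check roughly this naive. A hypothetical sketch of a blacklist keyed on the exact executable name:

```python
def securom_style_check(running_processes):
    # Naive blacklist by exact executable name. This is why renaming
    # procexp.exe to anything else, or running the 64-bit procexp64.exe,
    # sails right past the "protection".
    return any(p.lower() == "procexp.exe" for p in running_processes)

# The blacklist triggers only on the exact name...
assert securom_style_check(["explorer.exe", "ProcExp.exe"])
# ...and misses trivial evasions.
assert not securom_style_check(["explorer.exe", "totally_not_procexp.exe"])
assert not securom_style_check(["explorer.exe", "procexp64.exe"])
```

A real check would enumerate processes via the OS; the list argument here just keeps the sketch self-contained.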
  • StarForce, another Copy Protection program. It was so poorly written that upon installation, it would open huge holes in security, crash the computer's OS (Windows Vista in particular would require a reinstall) and could break the disk drive's hardware. For example, it would disable SCSI and SATA drives based on the OS' presumption that they were both virtual CD/DVD drives. It doesn't help that the latter is a common connector for newer disc and hard drives.
  • Norton was once the ultimate antivirus software, and the standard the others aspired to. Then they decided to focus on anti-piracy. The result: the virus scanner hardly works, and even when it does, the virus discovered cannot be expunged in most cases. For this, Norton now has the ultimate pirate defense: it's such an awful program that anyone knowledgeable enough to pirate software will just pirate the paid, more feature-rich version of a competitor that has a free edition, or go the legal route and use those free editions or Microsoft Security Essentials, all free, yet somehow still far better than Norton.
    • With Norton 360, attempting to create a backup will take you to a section where you have to login with your Norton Account. The problem is, if you don't happen to know the account's password (like if your parents gave you the CD as one of the 3 or so allowed copies of the program), the window's exit and right-click-close buttons are grayed out, and Task Manager can't kill the application. You have to sign in properly or it declares your copy pirated despite providing the CD Key earlier.
    • Many antiviral suites such as Norton, Kaspersky, and McAfee have a certain yearly subscription which users must pay, or else the antivirus will stop protecting their computer. So far, so uninteresting. However, with many of these suites, when the subscription expires, the antivirus will continue scanning every webpage the customer visits, and be unable to clear it for viewing. This results in extreme slowdowns and general lack of connectivity, which can only be fixed by either uninstalling the program in question - or paying more for another year's sub. This is a brilliant marketing tactic right up until you realize that many customers have McAfee, Norton, or Kaspersky on their freshly purchased computers for the rough equivalent of one month, at which point their shiny new piece of hardware goes defunct and they become justifiably outraged by what functionally works as a scam.
  • Ubisoft's online DRM, which prevented players from playing their games when the servers went down.
    • It's starting to get worse with their UPlay software, which is simply another Steam clone, much like EA's Origin. However, whereas Origin started Growing the Beard in 2013, UPlay is nowhere near polished enough for prime time. And it's their main DRM platform now.
    • At one point, their UPlay software had a remote code execution bug. One hacker built a proof-of-concept exploit that would launch Windows Calculator remotely, simply by visiting his website.
    • Uninstalling a game that uses UPlay can cause massive memory leaks that will grind systems to a halt just with regular use - you have to use system restore to a point before you installed the game at all.
    • In 2013, UPlay was hacked and a version of Far Cry 3: Blood Dragon was downloaded, full and complete, a few months before its official PC release date. This forced Ubisoft to release the game around the same time as the console "exclusives".
    • On July 30, 2012, a Google security researcher discovered that the UPlay browser plugin functions as a rootkit. Whether or not this behavior has changed remains to be seen.
    • Say what you will about EA and Origin, but at least they have the decency to not sell their games on Steam if they want to force people to use their own program instead. You can still buy a Ubisoft game on Steam, but you still need to install UPlay to play it.
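The calculator exploit above worked because the UPlay browser plugin would reportedly launch whatever path a web page handed it. A minimal sketch of the missing safeguard, with the game names and function invented for illustration:

```python
import os

# Hypothetical whitelist of executables the launcher is allowed to start.
ALLOWED_GAMES = {"farcry3.exe", "acbrotherhood.exe"}

def plugin_launch(requested_path):
    # The reported bug was the equivalent of spawning requested_path
    # unconditionally. A launcher exposed to arbitrary web content must,
    # at the very least, whitelist what it will start.
    name = os.path.basename(requested_path).lower()
    if name not in ALLOWED_GAMES:
        raise PermissionError(f"refusing to launch {name}")
    return name  # stand-in for actually spawning the process
```

Even this is a band-aid; the deeper fix is not exposing a process launcher to every website in the first place.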
  • The PC version of Gears of War had a DRM certificate that would expire on January 28, 2009, making the game unplayable after that date. Luckily, Epic released a patch to fix this shortly after.
  • EA's smartphone version of Tetris requires an Internet connection. For a single-player game. And if your connection drops in mid-game, it'll kick you back to the title screen. Despite the fact that there are no features in the game that should even require Internet connectivity. Fortunately, this is subverted with the iOS and Android versions.
  • The copy protection for Melty Blood Actress Again Current Code has a truly insane activation procedure. You have to insert the CD, type in the CD key, and install the game. Then, remove the CD, try to run the game (which will give an error), and click a second button to connect to a website into which you must type a code the game gives you plus a second CD key from the game box. It will then allow you to download a file which you must place in the game's install directory - which requires going through elevation if you are using Windows Vista or later. Finally, you can then run the game... ack.
    • The "type CD key to install" part is especially hilarious, because you don't need to install anything: the game is stored unpacked as a folder on the DVD itself, as many Japanese games actually are. You can just copy this folder to any destination you want.
  • Cross Beats, despite being a primarily single-player game (with some social elements), has always-on DRM that accesses the server for literally everything, including every menu transition. Perhaps not so coincidentally, less than 2 hours after launch, the whole game had to be temporarily taken down for "emergency maintenance". The official reason given was "user data load distribution processing", which was most likely PR-speak for "Oops, we accidentally DDoS'ed ourselves."
  • Even Steam is not immune to this. If you haven't run it in a while, you may discover that it's trapped in an infinite loop: it attempts to update itself, but aborts partway through because your internet connection faltered for a few milliseconds, and it quits without saving what it's already downloaded. You can't run Steam because it needs an update, and you can't update Steam because you can't run it. You'd think the solution would be to download the update outside of Steam, but Valve doesn't provide any way to do so.
    • The solution for this was to re-run Steam and go through the entire download process again. Of course, then your internet hiccups again, and you spend half of the day repeating this over and over until you get lucky and the update completes without your internet hiccuping even once, just so Steam is updated and you can get into the damn game you paid for.
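The missing piece is ordinary resumable downloading: keep the partial file and ask the server to continue from where it left off via an HTTP Range header. A sketch of just that bookkeeping (the actual network call is omitted, and the file name is illustrative):

```python
import os

def resume_range_header(partial_path):
    # If a partial download survived the last attempt, request the rest
    # of the file instead of starting over. An HTTP server that supports
    # ranges replies 206 Partial Content and the bytes are appended.
    done = os.path.getsize(partial_path) if os.path.exists(partial_path) else 0
    return {"Range": f"bytes={done}-"} if done else {}
```

Pass the returned dict as extra headers to any HTTP client; a flaky connection then costs you only the unfinished tail, not the whole update.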
  • The process of installing Battlefield 3 can be used as an argument for why EA should stop forcing its Origin platform on their physical releases, and it didn't really provide a good first impression of the service. First, Origin treats installing from a DVD as "downloading" the game, which feels odd when you can't pause and resume it. Second, you cannot quit Origin while it's downloading and installing an update, which is something Steam can do. If you want to run the game, you also need to log on to their Battlelog website. And lastly, if this is a clean install, you have to install a plugin for your browser so the website can launch the game (which sounds kind of fishy). But overall, you do not launch Battlefield 3 from Origin, you launch it from the Battlelog website (although if the service is offline, launching the game will take you to the single-player part).

    Games That Play Dumb 
There are times when even a Core I7 processor will meet its match, and there are times when even an amateur programmer will shake their head in disgust.

Note: Consider if the entry would fit more into Obvious Beta or Game-Breaking Bug.
  • Jan Ryu Mon, an online Mahjong game by PlayNC, which has plenty of evidence suggesting it was programmed by drunken monkeys:
    • If a cookie gets blocked from the site (most notably the registration page), the server gives a "Fatal Error" page with a stack trace - no indication whatsoever that the problem is a blocked cookie.
    • The login only works in Internet Explorer - attempting to log in on any other browser will result in a "Please use Internet Explorer 6.0 or higher to log in" pop-up message as soon as you clicked on the username or password fields, and then repeat the pop-up every time you try to type a letter, click on one of the fields, or click the "Login" button. There is absolutely nothing on the site that doesn't work in Firefox, except...
    • Even though the game ran in a separate .exe file, it would give an error message on startup if the executable was launched directly. The only way to launch it was to download the ActiveX extension (for IE only) from the game's site, log in, then use a button on the site to launch the extension for the sole purpose of launching the game; the extension would pass along your login information to the game, and they apparently didn't think to make the game executable ask for a login itself.
      • Scary fact: this is becoming an extremely common way to handle log in "securely". Even Final Fantasy XIV is using a variation with an embedded IE session in a "login client" window. Try loading quite a few MMOs with your Internet connection offline and watch the fun.
    • Then when you logged in, the game was ridiculously slow because of the eye-candy animations for every single turn. This was further exacerbated by the graphics engine, which was so inefficiently programmed that the game experienced more lag and frame-skip to show a Mahjong table with one hand moving one Mahjong tile than Touhou games do trying to animate 2000 bullets simultaneously. This is often caused by graphics engines which cache nothing or very little, instead opting to re-render most or all of the screen from scratch on every single frame.
    • To add insult to injury, their server had major routing issues during the beta (and still has some as of this writing), forcing many players to go through a proxy just to connect, which also meant a LOT of lag: a round could easily take half an hour, when many other online Mahjong games can finish a round in 10-15 minutes.
    • And then there's a plethora of random crazy Game Breaking Bugs.
    • Plus the game occasionally deals a hand of 13 Haku tiles, even though there's only 4 of each tile in a set.
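A hand of thirteen identical tiles is the signature of drawing with replacement instead of dealing from a shuffled wall. This is a guess at the bug rather than PlayNC's actual code, but it reproduces the symptom and its fix:

```python
import random

def build_wall():
    # A mahjong set has exactly 4 copies of each of 34 tile kinds.
    return [kind for kind in range(34) for _ in range(4)]

def deal_buggy(rng):
    # Plausible reconstruction of the bug: each draw picks from the FULL
    # wall (with replacement), so nothing stops 13 identical Haku.
    wall = build_wall()
    return [rng.choice(wall) for _ in range(13)]

def deal_correct(rng):
    # Shuffle once, deal from the top: at most 4 of any kind, guaranteed.
    wall = build_wall()
    rng.shuffle(wall)
    return wall[:13]

class StuckRNG:
    # Degenerate RNG that always "randomly" picks the first element,
    # standing in for an unlucky (or unseeded) generator.
    def choice(self, seq):
        return seq[0]

assert deal_buggy(StuckRNG()) == [0] * 13          # thirteen of a kind
hand = deal_correct(random.Random(0))
assert max(hand.count(k) for k in set(hand)) <= 4  # impossible when dealt properly
```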
  • Big Rigs: Over the Road Racing. It's an obvious pre-alpha build of a game, with no collision detection, no AI and some incredibly bad programming of physics (as in, go as fast as you want – Warp 9000, if you have the patience – but only in reverse). Despite being obviously an alpha, the creators tried to sell it as it was anyway.
    • In one of the strongest cases of "why bother?" in history, they added AI in a patch... which causes the opposing truck to drive a fixed course at a rate of 1 mph and stop short of the finish line. The reason the opponent stops there is that, as far as the game is concerned, it has finished the race; not that it matters, because there is no loss condition in the game. That's right, the game simply doesn't allow you to lose.
    • Something in the game's code causes it to be a severe memory vampire - it uses half the available RAM when it's running. No one knows why. Lord knows nothing in the game itself supports its needing anywhere near that amount.
  • Sonic the Hedgehog (2006) is an Obvious Beta – probably the most infamous example from a AAA-publisher – rightly criticized for all manner of things. But probably the biggest complaint is its Loads and Loads of Loading, particularly in the hub worlds. This is because the game loads the entire hub world whether it's needed or not… even for a single line of dialogue.
    • In several cases, talking to an NPC would cause the game to reload the entire hub world from the disc, even though you were already in the hub world. Once it finished, the NPC would speak a single line of dialogue, and then it would load the entire hub world again. This is subverted in one of the missions in Shadow's story, where you are given the instructions first and the game loads only once. You'd think they would have made all the missions work like this to bypass the ludicrous and frequent load times.
    • This issue is demonstrated well during one of the boss rounds: one bug lets you explore the whole hub world, even though only a small portion of it is used in the fight. Of note, the textures for certain locations won't load if you do, but the underlying geometry will.
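The obvious remedy for the reload-per-dialogue problem is to cache hub assets in memory once loaded. A minimal sketch of the idea (names illustrative, not Sonic Team's actual code):

```python
class HubLoader:
    # Keep hub assets resident after the first disc read, instead of
    # re-reading the whole hub for every line of NPC dialogue.
    def __init__(self):
        self._cache = {}
        self.disc_reads = 0

    def load(self, hub):
        if hub not in self._cache:
            self.disc_reads += 1                 # the slow, seconds-long part
            self._cache[hub] = f"<assets for {hub}>"
        return self._cache[hub]

loader = HubLoader()
for _ in range(3):                               # talk to three NPCs in a row
    loader.load("soleanna")
assert loader.disc_reads == 1                    # one disc read, not three
```

Real engines add eviction for memory pressure, but even this one-line memoization would have removed the double reload described above.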
  • Action 52 for the NES is already a masterpiece of broken game design, but some of the glitches in the game are caused by the mapper in the cartridge, not just the coding. Action 52 uses Mapper 228, which is unusual in that it's similar to the mappers found in most bootleg multi-carts. NES games use many different types of mappers, and some have extra capabilities that can pull off special effects; for example, the Japanese version of Contra has snow blowing in the mountain level, which was removed in the international versions to cut costs. When using a mapper, you need to be aware that the console can only read information from it in a specific way, or else you'll run into problems; and even then it's not guaranteed to work, because the NES simply isn't made to read unlicensed mappers, especially if you program something in a way the console can't handle. One notable example of this issue: Alfredo and the Fettucini and Jigsaw always crash when you try to load them normally, although most games on the cart start just fine. Of course, Action 52 is already infamous as it is for boasting so many games on one cartridge, not a single one of which got past Obvious Beta or is even considered acceptable by alpha standards.
    • The mapper issue is the same reason why the unreleased Cheetahmen II stops at Level 4, as it uses the same cart as Action 52. While this is fixed in unofficial patches and the still incredibly buggy re-release, if by some miracle you own the original, you need to fiddle with the cartridge upon inserting it into the system so it'll boot up on the inaccessible levels.
  • The Tetris: The Grand Master clone Heboris, specifically the unofficial expansion, has more or less died out, because attempts to peek into the source code, much less make any further modifications, have proven futile due to the game being a messy hybrid of a game scripting language and C++.
    • On a related note, there were a handful of genuine C++ ports of it. However, the MINI version (which allows "plugins" to define additional game modes and/or rotation rules) is the most commonly-used version, and the way it works pretty much inhibits any attempt at porting it entirely to C++.
  • Zero Gear isn't problematic in and of itself... but the nature of its Steam integration allows it to be used to play any Steam-locked game you want, without owning the game. This is most notably used by hackers to bypass VAC bans: just start a new account, download the Zero Gear demo, and copy the files over.
  • The save system in Pokémon Diamond, Pearl and Platinum. Making any changes in the PC storage system, or even just opening it, triples the save time from 5 seconds to 15, because an overly cautious integrity check runs a hefty checksum review over every slot in the storage system to make sure none of it is corrupted. Are you the type that likes to save regularly? Sucks to be you! The sad thing is, the game saves the same way Pokémon Ruby, Sapphire and Emerald did, and those games consistently took about five seconds to save.
    • Not that those earlier games did everything right, either. Pokémon Emerald has a flaw in its RNG: a new personality-value seed is generated per save file, not per boot-up. The end result: once the seed is deciphered, the player (so long as they don't start a new game and save over their file) is left with a predictable sequence of personality values to potentially break the game with.
    • For all the innovations Pokémon X and Y brought to the table, they had their fair share of failures. The profanity filter is notoriously arbitrary, blocking innocent nicknames because they contain profanity in other languages (e.g. "Violet", because "viol" means "rape" in French). This is particularly bad because some of the censored words become legal under certain circumstances (e.g. having any other character attached to either end of the word) while others don't get that leniency. The companion app Pokémon Bank is also notorious for letting through hacked Pokémon (which it's supposed to block) while rejecting legitimate ones. And of course, at launch, Game Freak forgot to encrypt the games' wi-fi communications, enabling unscrupulous players to use a program to identify what moves their opponent was using each turn in a match; the version 1.2 patch (mandatory for online play) fortunately fixed this oversight.
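A filter that checks for a banned string anywhere inside a name will always produce false positives like "Violet". Here is a minimal sketch of the difference between substring matching and whole-word matching; this is hypothetical illustration code, not Game Freak's actual filter:

```python
# Hypothetical sketch (not Game Freak's actual code) of why substring-based
# profanity filters misfire: any name *containing* a banned string is
# rejected, so French "viol" takes innocent "Violet" down with it.
import re

BANNED_SUBSTRINGS = {"viol"}  # one illustrative multilingual blocklist entry

def naive_filter(nickname):
    """Substring matching: returns True if the nickname is allowed."""
    lowered = nickname.lower()
    return not any(bad in lowered for bad in BANNED_SUBSTRINGS)

def whole_word_filter(nickname):
    """Whole-word matching: only rejects the banned word standing alone."""
    lowered = nickname.lower()
    return not any(re.search(r"\b" + re.escape(bad) + r"\b", lowered)
                   for bad in BANNED_SUBSTRINGS)

print(naive_filter("Violet"))       # False: blocked despite being innocent
print(whole_word_filter("Violet"))  # True: allowed
```

Whole-word matching has its own blind spots (deliberate misspellings, for one), but it at least doesn't punish players for owning a perfectly ordinary name.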
  • Sometimes, Dungeon Fighter Online's launcher shows a broken page saying that navigation to the page was canceled. Since the entire program is based on Internet Explorer, even the user-made Firefox plugin from DFO Nexus can't access the page, usually returning a timeout error instead. The worst part, however, is that the breakage wasn't limited to the login page: a bug involving SSL certificates could present login pages that weren't Neople's at all, such as godaddy.com or datameer.com. Fortunately, all of this seems to be fixable by resetting the router.
  • Indie game developer and utter egomaniac MDickie has released the source code for many of his games, including the infamous The You Testament. By looking at the code of the latter, you discover for instance that the "exit program" function works by deliberately crashing the game. And that's just the tip of the iceberg.
  • The open-source Windows port of Syndicate Wars eats insane amounts of disk space for no logical reason.
  • Death Trap is already an infamous point-and-click horror game, but one of its more notorious aspects is how it works under the hood. ActionScript 2 has two commands: gotoAndPlay(frame number), which jumps the main timeline to the given frame and keeps playing, and gotoAndStop(frame number), which jumps there and halts. Attached to an on(release) button handler (and you can only use one of the two per button, or it won't work), they're perfectly fine for really basic stuff like menu buttons. Neither command is inherently bad; they land this game under Idiot Programming because they're misused. They are not meant for anything more complex, like, say... the movement buttons in a point-and-click game; you can guess the author used them anyway because of how simple they were. And because both commands operate on the main timeline, it's impossible to build a traditional point-and-click game this way without making it utterly linear. Which is exactly the game the author made.
  • Back in the days of DOS and Windows 9x, many games (such as the PC version of Slave Zero, as Lowtax discovered in one of the first articles he ever wrote for Something Awful) were hard-coded to assume that the CD-ROM drive was D:, rather than actually bothering to check with the OS to see that this was the case. If you had more than one hard drive, or had your disk partitioned — which a lot of people with >2GB hard drives had to do prior to Windows 98 arriving on the scene — you were generally out of luck unless you could either edit the game's configuration files or find a No-CD patch.
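The right approach was always to ask the OS which drive actually holds the disc (on Windows, via the GetDriveType API) instead of assuming D:. A toy sketch of that logic, with the Windows API mocked out as a plain dictionary so it runs anywhere; the drive layout is hypothetical:

```python
# Toy sketch of doing it right: ask which drive is the CD-ROM instead of
# assuming D:. On real Windows this query is kernel32's GetDriveType; here
# the API is mocked with a dictionary so the logic runs anywhere.

DRIVE_FIXED, DRIVE_CDROM = 3, 5   # the actual Win32 GetDriveType constants

# Hypothetical machine: two hard-disk partitions push the CD-ROM to E:
mock_drives = {"C:": DRIVE_FIXED, "D:": DRIVE_FIXED, "E:": DRIVE_CDROM}

def find_cdrom(drives):
    """Return the first drive reported as a CD-ROM, or None."""
    for letter in sorted(drives):
        if drives[letter] == DRIVE_CDROM:
            return letter
    return None

print(find_cdrom(mock_drives))  # E: -- not the hard-coded D:
```

A dozen lines of enumeration would have spared countless players their No-CD-patch safaris.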
  • Magicka is a very fun game, but sometimes it's so difficult to work with that it's almost not worth the trouble.
    • The game takes almost two minutes to start up. That's before the company logo shows up, too. You just sit there staring at a black screen for two minutes. (Thankfully, you can skip straight to the title screen once the company logos do start appearing.)
    • The graphics are nice, but that's because of its superb art direction. Technology-wise, it's a fairly standard 3D top-down brawler... And yet it chugs down resources like a cutting-edge first-person shooter. A laptop that runs Team Fortress 2 and Left 4 Dead can't keep up with Magicka.
    • If someone's connection drops during a multiplayer game, there is no way for them to re-join. The remaining player(s) must either go on alone, or quit and go back to the lobby.
    • The developers acknowledged the game's disastrous launch with a series of blog posts and weeks of patching that made the experience more bearable. Then, with their characteristic and warped sense of humor, they introduced a free DLC called "Mea Culpa" that gave each player character a set of gag items: A "Patched" robe, a "Buggy" Staff, a "Broken" Sword and a "Crash To Desktop" magick.
  • The developers of the Anno Domini series need to be slapped with some basic GUI guideline books. For example, the first game would only actually save after you hit the save button; merely naming the save game did nothing. It would also completely remove the game directory on uninstall, including save games and settings. Anno 1503 (the second game) never got its promised multiplayer mode. Anno 1404, released in 2009, still assumed that there would be only one user, and that this user would have admin rights.
  • The installer for Duke Nukem Forever seems to have been programmed by someone with an "everything but the kitchen sink" mentality. Not only does it install a bunch of applications and frameworks that the game doesn't actually use, but it installs the AMD Dual-Core Optimiser, regardless of whether your CPU is multi-core, or even made by AMD.
  • EVE Online had a stellar example of what not to do in programming with the initial rollout of the Trinity update. EVE had a file called boot.ini that contains various parameters... but boot.ini is also a critical Windows system file stored in C:\. A typo in the patch caused it to overwrite the version in the root directory instead of the one in the EVE Online folder, resulting in an unbootable system that had to be recovered with a rescue disk. This is why you never name your files after something that already exists in the OS. (Since that debacle, the file in question has been renamed start.ini.)
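The underlying mistake is one of path resolution: a patcher that builds its target path from the wrong root can silently clobber an unrelated system file that happens to share the name. A hedged sketch with made-up paths (not CCP's actual patcher), using Python's ntpath so the Windows path rules apply on any platform:

```python
# Sketch of the failure mode with made-up paths (not CCP's actual patcher).
# ntpath applies Windows path rules even on other platforms, so the example
# runs anywhere. The fix: always resolve the file against the application's
# own directory, and better yet, never reuse a system file's name at all.
import ntpath

install_dir = "C:\\Program Files\\EVE"   # hypothetical install location
system_root = "C:\\"                     # where Windows' real boot.ini lived

def naive_target(filename):
    # Joined against the wrong root: this is how a typo'd patch path ends
    # up pointing at the critical C:\boot.ini instead of the game's copy.
    return ntpath.join(system_root, filename)

def safe_target(filename):
    return ntpath.join(install_dir, filename)

print(naive_target("boot.ini"))  # C:\boot.ini -- the system file
print(safe_target("boot.ini"))   # C:\Program Files\EVE\boot.ini
```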
  • Valve took it Up to Eleven with its Linux Steam client: an unguarded rm -rf "$STEAMROOT/"* in its launcher script meant that if the variable came up empty, such as when the install directory had been moved, it would happily delete every file on the machine the user could touch.
    • To make matters worse, Steam's uninstaller also deletes everything in the folder it's on.
    • When adding a game library that already has games on it, Steam may delete the entire thing and start over. Even if you have all your games on it.
  • Myth II: Soulblighter had an uninstaller bug that was discovered after the game had gone gold. If you uninstalled the game, it deleted the directory in which the game was installed. It was possible to override the default install settings and install the game in the root directory of a given drive. Fortunately, only one person suffered a drive wipe as a result (the person who discovered the bug), and they actually replaced the discs after the copies of the game were boxed, but before the game was shipped. Still, it was a fairly glaring blunder.
  • Diablo II has fairly simple mechanics due to its nature as an online title released in 2000. That did not prevent Blizzard from introducing bugs into literally every skill tree, affecting about 20% of all skills in the game. The curse resist aura got weaker as you put points into it; a rapid-fire melee attack missed completely if the first swing missed; masteries claimed to increase damage on spells and items but didn't actually do it; Energy Shield bypassed resistances, meaning your mana was drained in two fireballs; and homing projectiles went invisible if fired from off screen. Lightning bolt spells ignored faster-cast-rate items for no particular reason. Berserk correctly set your defense to 0 when you used it, then using it again gave you negative defense, and after a while it would roll over and give you 8 million defense. Numbers were wildly off on release: the high-level Lightning Strike spear skill would do a total of 50 damage at maximum spell level, and the poison from reanimated skeletal mages would do 1 damage per second over the course of five minutes. And that's just skills: there were also numerous dupe bugs, ways to teleport to locked maps, the list goes on.
    • Infamously, the game was only considered difficult for three reasons. About half of the combinations of random enchantments a boss could have interacted in bugged ways that amounted to an instakill (fun example: the fire enchanted/lightning enchanted combination erroneously added the huge damage of the fire-enchanted death explosion to every single one of the 10+ lightning sparks emitted each time the boss was struck). The poison clouds of claw vipers invisibly delivered their melee attack 25 times per second, resulting in a RRRRRRR sound and a very quick death. And gloams drain a slight amount of mana on attack, but also seem to deal 256 times that amount as damage whenever they hit you with anything.
  • Diablo III had a particularly funny example of this. When the game launched, the auction house gave you a 5-minute window to cancel your auctions; beyond that, they were stuck in the auction house for two days (a later patch allowed canceling). Fans found a workaround: set your computer's clock to the day before you put the auction up, and the cancel timer would appear again. Why it was programmed to use your computer's clock instead of the Battle.net servers' time, we will never know.
  • Space Empires V, unlike all its predecessors, is notorious as a resource hog, bringing even the mightiest of PCs to their knees, despite running only on DirectX 8:
    • The battle replay files generated by the game, instead of storing "Ship A shot at ship B and hit it for 15 damage to the armor.", instead store "Ship A fired bullet X. 50 milliseconds later, bullet X has moved a few pixels. After another 50 milliseconds, bullet X has moved a few more pixels. Repeat ad infinitum. Bullet X hit ship B for 15 damage to the armor." Thus, battle replays are frequently in the tens of megabytes - per turn!
    • Every time you load the ship list screen, the game loads into memory all the data about all the ships in the game, regardless of whether it is actually going to be used in the list's current view mode, delaying the screen's loading by upwards of a second. Fortunately, you can disable the loading of some of the larger bits of data, such as order queues, but then you can't see them in the list when you bring up the views that use them...
    • The game's developer apparently has no idea that multicore CPUs exist, and made the entire game single-threaded; thus, running the game on a 2GHz quad-core will be noticeably slower than running it on a 3GHz single-core.
    • Processing a turn for a multiplayer game when the game is set to fullscreen mode is much slower than processing the same turn when the game is set to windowed mode, in which the game doesn't bother drawing anything, but just processes in the background. Surely rendering a progress bar can't be all that taxing? (In fact, everything is slower in fullscreen mode for some reason... isn't that kind of backwards?)
    • The game has a scripting language which you can use to write random event and espionage scripts... but the language is horribly broken in the way it handles order of operations. Calculating 5 + 3 * 4 will give you 32, not 17 - but that's just the tip of the iceberg. Somehow you can manage to take 1, then repeatedly subtract and add 1 from it alternatingly, and wind up with any negative number you want, simply by repeating the subtraction-addition cycle the right number of times!
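An evaluator that simply grinds through the tokens left to right, with no notion of operator precedence, reproduces exactly that 32-instead-of-17 result. The real SE5 scripting engine isn't public, so the following is purely a guess at the failure mode:

```python
# Toy evaluator with no operator precedence, grinding through the tokens
# strictly left to right. Purely a guess at the failure mode -- the real
# SE5 scripting engine isn't public -- but it reproduces the 32.

def naive_eval(tokens):
    """Evaluate [number, op, number, op, ...] left to right."""
    result = tokens[0]
    for i in range(1, len(tokens), 2):
        op, rhs = tokens[i], tokens[i + 1]
        if op == "+":
            result += rhs
        elif op == "-":
            result -= rhs
        elif op == "*":
            result *= rhs
    return result

print(naive_eval([5, "+", 3, "*", 4]))  # 32: effectively (5 + 3) * 4
print(5 + 3 * 4)                        # 17: what precedence should give
```

How the alternating subtract-add trick manages to drift into arbitrary negative numbers on top of that is anyone's guess; a left-to-right bug alone doesn't explain it.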
  • The graphics engine for Need For Speed: Most Wanted (2012) literally cannot run above 30FPS without massive framerate drops. The drops have no correlation to the amount of action being rendered on-screen; they occur even in the loading screen animation, where no polygons are rendered whatsoever. This was discovered by people who noticed that locking the framerate to 30 in the PC version's configuration file eliminates the gratuitous slowdown that the PC version of the game is known for. Here's the thing, though: the game's graphics engine is a modified version of the one that powered the last Criterion NFS game, Hot Pursuit (2010), which ran perfectly fine at an uncapped framerate. How does a company in charge of a big-budget racing game series modify an existing, completely proprietary graphics engine and accidentally break its high-framerate capabilities?
    • In Need for Speed: Rivals, meanwhile, the (locked at 30 FPS) framerate is directly bound to the real speed of the game. Setting it to 60 FPS through third-party software makes the game run at double speed, resulting in screwy behavior of all sorts including broken physics.
  • Welcome to Ponyville was a relatively short My Little Pony: Friendship Is Magic visual novel, and yet it was over 1GB in size. This was apparently due to the fact that the programmers appeared to have no knowledge of audio compression: for example, a brief file for a closing door was over 32MB (with several minutes of silence included in the file to boot). Some have estimated that if the audio files were properly compressed along with the sprites the game could have been 60-90MB.
  • The original PlanetSide, a 2003 MMOFPS with up to 111 players on each of the three empires fighting on each individual 20x20km continent, handled everything clientside, with the game only checking up with what the server knows every couple of seconds. A significant amount of the gameplay was devoted to exploiting the hilariously bad netcode ("ADADA" strafing relied on the game not realizing when players shifted their strafing direction, causing them to teleport from side to side on other players' screens). Because everything was clientside, the game was also one of the easiest MMO games to hack. Common hacks included warping all players on a continent to stand directly in front of you (on your screen) and killing them, causing players from across the continent to spontaneously fall over dead, or disabling the weapon cone-of-fire, resulting in sniper shotguns and Gatling guns that could pick off players from a thousand meters out. Even without hacks, the game was pitifully easy to exploit. Players would stand somewhere safe, unplug their Ethernet cable, then run around the battlefield knifing enemy players to death; once happy with the kill count, they would plug the cable back in and bask in huge amounts of XP as half the battlefield dropped dead. This was eventually fixed by locking the game's controls after a few seconds with no server communication. For a long time, having a certain type of dual-core processor would cause the game to run in turbo mode, with vehicles and players moving and firing several times faster than normal. Thankfully, PlanetSide 2 uses much more serverside authentication, which has fixed almost all the issues with the first game's idiotic netcode.
    • The original PlanetSide is still referred to as "Clientside" to this day in some fan circles.
  • The Android port of the classic SNES RPG Chrono Trigger has a rating of 3 stars on Google Play. How, you may ask, does such a beloved game get such a low rating? The reasons are many. First off is the file size. Chrono Trigger was completely rewritten for smartphones to include touch-based commands and movement. The result is a 37 MB app (the original SNES game was just a few megabytes in size). Another gripe is the game's very broken DRM system: to play the game, the user is required to have an Internet connection so that the app can authenticate itself and download the various areas of the game. That's right: this 37 MB app contains no game content whatsoever, apart from the title screen. What's more, the game automatically quits (more like crashes) if the Internet connection is lost. Without saving the game. Many a Chrono Trigger player has lost an hour or more of game time when their device switched from 3G or 4G to a home Wi-Fi network (phones momentarily cut off Internet when they change from 3G/4G to Wi-Fi). Even more egregious is the fact that the game is incompatible with many popular devices, even the Samsung Galaxy S3, a phone more than capable of running this game. In short, this port of Chrono Trigger is perhaps the worst console-to-mobile port ever.
    • The nail in the coffin? The Nintendo DS port of the game had already existed for years, set a precedent on how to use touch controls (albeit with a second screen), and even with several hours of additional content left a lot of empty space on a DS game card. There is absolutely no excuse.
  • The initial rollout of The Witcher had atrociously bad back-end code. This is best seen when upgrading to the Enhanced Edition. Despite the latter adding no small amount of content, the same computers ended up loading as much as 80% faster than before. That's the difference a little optimization can make.
    • Playing Poker Dice in the first title is heavily prone to crashing PCs. Not just the game itself, but the ENTIRE operating system, to the point where it forces a reboot.
  • The Sims 3 is known for having a large number of expansion packs...and for having a horribly wasteful method of installing them. Every time you install an expansion pack, another copy of an identical set of help files (in all languages) is installed on the machine. Plus, if you uninstall Sims 3 itself, it removes the majority of the actual content of the expansions - but not these extra help files, for which you must run the expansion pack's own uninstaller.
  • The Kickstarter-funded Ouya console has been identified as having a huge raft of bad ideas:
    • As with many console systems, Ouya gives the user the option of funding their account by buying funding cards at retail, which provide codes the user can type in to add money to their account. Unfortunately, the Ouya will not proceed beyond boot unless a credit card number is entered, making the funding cards a pointless option.
    • When an app offers an in-app purchase, a dialog is displayed asking the user to confirm the purchase - but no password entry or similar is required, and the OK button is the default. This means that if an app offers a purchase while you are pressing the button quickly, you may confirm it accidentally and be charged for an in-app purchase.
    • The system was touted from the beginning as open and friendly to modification and hacking. This sparked considerable interest, and it became obvious that a sizable part of the supporting community didn't really give two hoots about the Ouya's intended purpose as a gaming console; rather, they just wanted one to hack and make an Android or preferably Linux computer out of. The Ouya people, who like every other console manufacturer counted on making their profit more from selling games than hardware, promptly reneged on the whole openness thing and locked the Ouya down tight. The end result was a single-purpose gadget that had a slow, unintuitive and lag-prone interface, couldn't run most of the already-available Android software despite being an Android system, and didn't even have many games that gamers actually wanted to buy.
    • Also, the HDMI output is perpetually DRMed with HDCP. There's no switch to disable it, not even by turning on developer mode. People who were expecting the openness promised during the campaign are understandably angry at being lied to, as are those hoping to livestream or record Let's Plays of the games.
    • Even in its intended use the Ouya has disappointed its users. The main complaint, which still occasionally appears on /r/ouya when someone gets one second-hand, is that the controllers are laggy. On a console with mostly action-packed casual games, this is very bad. Apparently it isn't even a fault of the console itself, as the controllers exhibit the same lag when paired to a computer. Opinions differ on whether it was just a large batch of faulty controllers - apparently not everybody has this issue - or a design flaw that came out during beta testing but was knowingly ignored and quietly corrected in subsequent batches.
  • Splinter Cell: Blacklist is plagued with issues, such as poor multiplayer support because people can't seem to connect to one another (everything goes through Ubisoft's UPlay service), and the bizarre choice of where the game's saves are stored. You would think, if you regularly back up your saves, that they would live in the My Documents folder or the App Data folder. Nope: they live wherever UPlay is installed, usually in the Program Files folder. And to make matters even worse, the permissions on the saves folder are left open to everyone. Which wouldn't be a problem... if not for the fact that everything in Program Files is supposed to require admin privileges to write to, precisely so that random programs can't tamper with it...
  • The notoriously bad Super Mario World Game Mod Super Mario Superstar 1 has a weird example of this in the final battle. Not only does the final boss somehow completely fail to do absolutely anything (to the point no one can figure out what it was supposed to do) and make it Unwinnable by Mistake, but somehow, via some absolute miracle of shoddy programming... the boss' graphics seem to glitch if you enter its room through one door and not the other. As in, you go through one level ID and it appears correctly (with bad Microsoft Paint style graphics), you go from another level ID and the graphics are glitched tiles. No idea how the programmer pulled that kind of fail off.
  • Early in the life of the Xbox 360, many gamers used the console's optional VGA cable to play their games with HD graphics, as true HDTVs tended to be rare and expensive back then. PC monitors at the time usually had a 4:3 aspect ratio, which most game engines were smart enough to handle by simply sticking black bars at the top and bottom of the screen. However, some engines (including the one used for Need for Speed Most Wanted and Carbon) instead rendered the game in 480p — likely the only 4:3 resolution they supported — and upscaled the output. Needless to say, playing a 480p game stretched to a higher resolution (usually 1280x1024) looked awful, and arguably even worse than just playing it on an SDTV.
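Computing proper letterbox bars is grade-school arithmetic, which makes the stretched-480p shortcut all the more baffling. A sketch of the math those engines skipped (an illustrative helper, not any engine's real code):

```python
# The arithmetic those engines skipped: scale 16:9 content to the monitor's
# native width, then center it between black bars. (Illustrative helper,
# not any engine's real code.)

def letterbox(dst_w, dst_h, aspect_w=16, aspect_h=9):
    """Return (content_height, bar_height) for centered letterboxing."""
    content_h = dst_w * aspect_h // aspect_w
    bar = max((dst_h - content_h) // 2, 0)
    return content_h, bar

print(letterbox(1280, 1024))  # (720, 152): render 1280x720 with 152px bars
print(letterbox(1024, 768))   # (576, 96)
```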
  • Call of Duty: Ghosts is apparently a sloppily developed game. It weighs in at a hefty 30GB and requires 4GB of RAM (8GB recommended). You'd think this game would be amazing on all fronts... until you find that the game might be a little bloaty, will happily eat your RAM (it even stutters on the loading movies after the level is done loading), and looks no better than previous installments. Compare this to the Tech Demo Game Trope Codifier Crysis, which runs just fine on its recommended requirements (a "wimpy" dual-core processor and 2GB of RAM); it just needed a graphics card from its future (and even then, only two generations ahead) to run on absolute max settings at 60FPS.
  • The online play in Meteos Wars did not seem to take lag into account whatsoever. As a Falling Blocks Puzzle Game, it only needs to keep track of the other player's incoming blocks and controller inputs; instead, it seems to send data between players about the exact locations of every block as they fall, resulting in much more lag than necessary. If the connection becomes unstable, instead of taking guesses and correcting itself later, as all other falling-blocks puzzle games with online play do, it just stops the action until the signal restabilizes, or terminates the match outright if that takes more than a few seconds. All the while, as the online play shifts between slow and immobile, the timer counts down in real time, functioning totally independently of the game, with the result that nearly every online match ends in a time-out instead of an elimination. Geographic distance profoundly affects the lag, too: the closer the opponent, the less lag there is, and the game plays almost normally if your opponent is within about 300 kilometers. Conversely, it is almost unplayable against someone from another continent.
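A back-of-the-envelope comparison shows why input-based syncing is the norm for the genre. The message shapes below are invented, not captured from Meteos Wars, but the size ratio is the point: inputs are tiny, full board state is not.

```python
# Back-of-the-envelope comparison of the two sync strategies. The message
# shapes are invented, not captured from Meteos Wars, but the size ratio
# is the point: inputs are tiny, full board state is not.
import json

# Input-based sync: one small message per frame.
input_msg = {"frame": 1042, "buttons": "L"}

# State-based sync: the position of every falling block, every frame.
state_msg = {"frame": 1042,
             "blocks": [{"id": i, "x": i % 14, "y": i // 14}
                        for i in range(60)]}

input_bytes = len(json.dumps(input_msg))
state_bytes = len(json.dumps(state_msg))
print(input_bytes, state_bytes)  # the state payload is vastly larger
```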
  • When it was announced that the PC version of Titanfall would require a massive 48GB of hard drive space, many assumed that it was a sign of developers finally starting to include visual assets that far exceed those of typical 360/PS3 ports. As it actually turned out, a whopping 35GB of the game's installation was taken up by uncompressed audio files, in every single one of the game's twenty languages. The developers' rationale for this was because they thought the audio decompression would take up too much CPU power on older and/or bargain-basement PCs... never mind the fact that any CPU which struggles with anything as simple as audio decoding probably wouldn't be powerful enough to run the game in the first place. Had they compressed the audio files and actually made separate installers for different languages, the game install would probably only be about 15GB.
  • Fargus Multimedia's Russian bootleg of Putt-Putt Saves the Zoo is a complete collection of failure. On top of having enough problems as it is (characters commonly speak in the wrong voices, and their lips keep moving after finishing a line of dialog), they managed quite possibly one of the biggest fails imaginable: the game was originally completely unplayable. Why? They packaged the game with a blank W32 file, the file that executes the game. It wouldn't be until 2004 that fans fixed this by using torrents to grab an American W32 file and stick it into the Russian version.
  • The DOS versions of Mega Man 1 & 3 were programmed to base the speed they ran at on that of the computer's processor, with no upper limit. This isn't an inherently poor choice, as most DOS games did the same, not predicting the speed at which technology would improve. What sets it apart from those games is that while they run too fast to play on modern computers, Mega Man 1 & 3 ran too fast to play on computers on the market at the time of their release.
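The standard fix, then and now, is to scale every update by the real time elapsed rather than by "one frame". A minimal sketch of frame-rate-independent movement (illustrative only, obviously not the actual DOS code):

```python
# Minimal sketch of frame-rate-independent movement (obviously not the
# actual DOS code): scale each update by real elapsed time, so a faster
# CPU just means smoother motion, not faster gameplay.

SPEED = 100.0  # units per second

def advance(position, dt):
    """Move at SPEED units/second no matter how short the frame was."""
    return position + SPEED * dt

# A fast machine runs 1000 short frames; a slow one runs 10 long frames.
# Over the same second of real time, both cover the same distance.
fast = slow = 0.0
for _ in range(1000):
    fast = advance(fast, 0.001)
for _ in range(10):
    slow = advance(slow, 0.1)
print(round(fast), round(slow))  # 100 100
```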
  • don't take it personally babe, it just ain't your story closes by forcing itself to crash.
    • As does The You Testament and other games by Matt Dickie.
    • The version of Planescape: Torment on the Dungeons and Dragons Anthology collection does the same thing, which is odd, given the otherwise fine carryover job.
  • Super Monkey Daibouken. Despite being for the Famicom, as opposed to the Famicom Disk System, it had to load regularly. And that's to say nothing of the game's battle system, which is too inconsistent to be remotely coherent.
  • Hoshi wo Miru Hito had a world map so badly programmed that the display just spat out tiles at random. The display never showed your total hit points, and the battle menu was so broken that you couldn't even access it without progressing in the story.
  • LEGO Island 2: The Brickster's Revenge is infamous for its Loads and Loads of Loading, on par with the likes of Sonic the Hedgehog (2006). The explanation behind this? The game prioritizes rendering the loading screen over actually reading any data. This means that for every frame that is rendered, a small amount (as in, a single byte!) of information is read, which causes the game to spend an eternity loading things for absolutely no good reason. The kicker? This can be fixed by changing a single instruction in the game's EXE file, which instead changes the logic to read all the data before rendering a single frame, and the loading is minimized to only half a second rather than minutes on end.
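The pattern is easy to model: tie each read to a rendered frame and a mere 1 KiB file costs a thousand loading-screen draws; read first and draw after, and it costs one. A toy model of the before and after (not the actual game code):

```python
# Toy model of the before/after (not the game's actual code): tie each
# read to a rendered frame and a 1 KiB file costs 1024 loading-screen
# draws; read everything first and it costs one.
import io

data = bytes(1024)  # stand-in for a 1 KiB asset file

def load_one_byte_per_frame(stream):
    frames = 0
    while stream.read(1):   # one byte per rendered frame...
        frames += 1         # ...and a full loading-screen draw each time
    return frames

def load_in_bulk(stream):
    stream.read()           # read all the data up front...
    return 1                # ...then draw the loading screen once

print(load_one_byte_per_frame(io.BytesIO(data)))  # 1024
print(load_in_bulk(io.BytesIO(data)))             # 1
```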
  • The netcode for the original Dark Souls is infamous for being terrible. It can take upwards of a half-hour to summon a friendly player, and sometimes the game will think that you're being invaded by an unfriendly player, blocking you from exiting an area for upwards of ten minutes until the problem resolves itself. This isn't even getting into the PC port, which uses the much-maligned Games For Windows - Live.
    • Dark Souls II attempted to fix this problem by adding dedicated servers for matchmaking, which did alleviate the issues somewhat. Unfortunately, the dedicated servers are only used for matchmaking; once you actually connect everything is handled as a peer-to-peer connection, meaning astronomical lagspikes and severe Hitbox Dissonance if you happen to connect to another player across the world.
    • And it seems like From Software didn't quite learn their lesson the first time; Bloodborne's netcode is almost as bad as the original Dark Souls and has all the same issues. It also had terrible environmental optimization before patches, leading to Loads and Loads of Loading.
    • Dark Souls 3 had anti-cheat software so poorly programmed at launch that it would often mistake the game's own code for cheats, ejecting customers who had done nothing at all from online play.
  • Ah, Minecraft: a beautiful world that rewards creativity, a multiplayer experience like no other, endless exploration to wherever your heart desires, and the uncanny ability to bring modern quadcore computers to their knees. How is this possible? How does a game with graphics so simple they should run fine on a Pentium 2 require a modern gaming box to play at 1080p without stuttering - let alone if you dare to apply graphic-improving mods? Well, let's just say there's a reason nobody writes 3D games in Java. There were even third-party attempts to recreate the game in C, but obvious legal issues caused them to be dropped, to the groans of competent programmers the world over.
  • The netcode in JoJo's Bizarre Adventure: All Star Battle is terrible, resulting in mind-blowing amounts of lag no matter how good your connection is. This thankfully doesn't affect Campaign mode (where you fight other players' "avatars", and through which much of the content is unlocked), because that mode only uses the connection to look up what other players have set as their characters (the actual fight is you versus the AI).
  • Dragon Age: Inquisition was loaded with bugs from the very beginning. Most notorious among them, however, is the infamous Patch 4. It's bad enough that the whole thing was nearly seven gigabytes, but a programming error in the Xbox One port caused it to uninstall and reinstall the entire game.
  • Banzai Pecan: The Last Hope for the Young Century is not without its problems, but none were as egregious as the palette-swap feature for its main character. Without palette swaps, the game usually takes roughly 4 seconds to load, but using any of them (save for her Awakened Mode) extends the load times from 4 to nearly 16 seconds. What makes this even more ironic is that the game can render its most basic enemy type in many other colors without any issues.
  • Grand Theft Auto IV: The x86 Microsoft Windows port of the game is commonly cited as an example of this trope before patches were released. Dog-slow performance on the most commonly available computers of the time, and a graphical settings menu where setting the quality to low actually made the game run slower. True, patches helped alleviate these problems, and the game engine boasts its share of game mods, but the game became a textbook example of shoddy PC ports. Also, the game required the infamous Games For Windows, a program that is discussed in the Microsoft section of this page.
  • The remake of Cannon Fodder recreates the game with a modern 3D engine - which takes an incredibly long time to load, even on modern fast computers, and doesn't keep loading in the background if you alt-tab out of the process. Click start, go brew a cup of coffee, lazily mix it with sugar and milk, drink it while you read the paper, then come back and if you're lucky the game will be just about ready to start. And for all that, the graphics are really nothing to write home about.
  • Sega, with their Mega Drive/Genesis titles on Steam, finally opened up modding support for them through the Steam Workshop, much to the rejoicing of fans everywhere. However, it was soon discovered that, for some reason, uploaded mods that utilised SRAM support wouldn't save their data... unless the mod was listed under Sonic 3 & Knuckles. And even then, it wouldn't save all data properly if the mod's SRAM data was bigger than the aforementioned game's, as seen with Sonic 3 Complete.

    Hardware That Wears Hard on You 
You'd think these problems would be ironed out before launch, especially if made by a Mega Corp.
  • Some more examples courtesy of Sony: (Are Microsoft and Sony having an "anything you can screw up, we can screw up worse" competition or something?)
    • First, the initial batch of PS2s was known for starting to produce a "Disc Read Error" after some time, eventually refusing to read any disc at all. The cause? The gear for the CD drive's laser tracking had absolutely nothing to prevent it from slipping, so the laser would gradually go out of alignment.
    • The original model of the PSP had buttons too close to the screen, so the Einsteins at Sony moved over the switch for the square button, without moving the location of the button itself. Thus every PSP had an unresponsive square button that would also often stick. Note that the square button is the second-most important face button on the controller, behind only X; in other words, it's used constantly during the action in most games. Sony president Ken Kutaragi confirmed that this was intentional, conflating this basic technical flaw with the concept of artistic expression.
    • And before you ask, yes, this is a real quote sourced by dozens of trusted publications. The man actually went there.
    • Another PSP-related issue was that if you held the original model a certain way, the disc would spontaneously eject. It was common enough to be a meme on YTMND.
    • The original PlayStation wasn't exempt from issues either. The original Series 1000 units and later Series 3000 units (which converted the 1000's A/V RCA ports to a proprietary A/V port) had the laser reader array at 9 o'clock on the tray. This put it directly adjacent to the power supply, which ran exceptionally hot. Result: the reader lens would warp, causing the system to fail spectacularly and requiring a new unit. Sony admitted this design flaw existed... after all warranties on the 1000 and 3000 units were up and the Series 5000 with the reader array at 2 o'clock was on the market.
  • The infamous "Red Ring of Death" that occurs in some Xbox 360 units. Incidentally, that whole debacle was blown way out of proportion, no thanks to the media (it's important to note, however, that Microsoft released official numbers stating that 51.4% of all 360 units were or would eventually be affected by the issue listed below). While there were an abnormally high number of faults, once the media reported the problem, people everywhere started sending back perfectly functional consoles – for the record, only "three segments of a red ring" meant "I'm broken, talk to my makers". Other red-ring codes could be as simple as "Mind pushing my cables in a bit more?", something easy to figure out if you Read the Freaking Manual.
    • However, the design flaw that led to the fatal RRoDs was at the very least a boneheaded decision on Microsoft's part, and at worst, proof that They Just Didn't Care about making something reliable. Basically, the chip containing the graphics core and memory controller got exceptionally hot under full load, and was only cooled by a crappy little heatsink. This led to the chip in question actually desoldering itself from the motherboard after a while, and people who opened up the cases on dead units actually reported the chip falling out of the console after removing the heatsink.
      • The heat sink was originally a lot larger, but was shrunk to make room for the DVD drive. You'd think they would've tested it after making a risky design choice like that.
      • A more plausible explanation is that the solder joints weren't very reliable under repeated thermal stress. Eventually they crack. The same thing happens to first generation Playstation 3 models (the Yellow Light of Death), albeit much later. Whoever was commissioned to do the assembly of both the 360 and PS3 must've had a grudge.note 
    • The 360 has another design flaw that makes it very easy for the console to scratch your game discs if the system is moved, intentionally or not, while a disc is still spinning in the tray. The problem is apparently considered so insignificant (though ironically MS themselves are fully aware of it) that when they made the Slim model of the system they fixed the Red Ring issues (somewhat) but not the disc scratching.
      • Most mechanical drives can at least tolerate some movement. It's not recommended (especially for hard drives, where the head is just nanometers away from the platter), but not accounting for any movement at all is just bad. Anyone who has worked in the game-trading industry (such as GameStop/EB Games) can tell you that not a day goes by without someone trying to get a game fixed or traded in as defective due to the evil Halo Scratch.
      • Microsoft recommends not placing the Xbox One in any position other than horizontal, because the optical drive isn't designed for any other orientation.
    • Most of the 360's problems stem from the inexplicable decision to use a full-sized desktop DVD drive, which even in the larger original consoles took up almost a quarter of the internal volume. Early models also had four rather large chips on the motherboard, due to the 90-nm manufacturing process, which made them run quite hot (especially the GPU-VRAM combo that doubled as a northbridge). But the relative positions of the GPU and the drive (and the latter's bulk) meant that there simply wasn't any room for a practical heatsink! Microsoft tried to address this problem in two separate motherboard redesigns, the first of which finally added at least some heatsink, but it was only the third — once the chipset had shrunk to just two components, letting designers completely reshuffle the board and even add a little fan atop the new, larger heatsink — that mostly did away with the problem. However, even the Slim version still uses that hugeass desktop DVD drive, which still has no support for the disc, perpetuating the scratching problem.
    • The circular D-Pad on the 360's controller. Anyone who's used it will tell you how hard it is to reliably hit a direction on the pad without hitting the other sensors next to it. The oft-derided U.S. patent system might be partially responsible for this, as some of the good ideas (Nintendo's + pad, Sony's cross pad) were "taken". Still, there are plenty of PC pads that don't have this issue to the same degree.
      • Ditto the D-Pad of the Microsoft SideWinder Freestyle Pro gamepad.
  • The problems with the interface aren't really that bad. Remember the LPT port? All the messed-up ISA/EISA variants? The interface design is really only bad compared with a) modern designs and b) the design of the floppy disc itself, which is often hailed by those in the tiny field as the best connection system ever, for appearing square (making it look good) but having only one way to insert it, period (in contrast to even USB, which has two ways to be forced in, only one of which works).
    • The LPT port is a pretty cool interface, if you like to throw together homemade devices that don't need a controller IC.
  • After mocking the childishness of the GBA through PR, Nokia created the complete joke of a design that was the original N-Gage. As a phone, the only way you could speak or hear anything effectively was to hold the thin side of the unit to your ear (earning it the derisive nickname "taco phone" and the infamous "sidetalking"). From a gaming point of view, it was even worse, as the screen was oriented vertically instead of horizontally like most handhelds, limiting the player's ability to see the game field (very problematic with games like the N-Gage port of Sonic Advance). Worst of all, however, was the fact that in order to change games, one had to remove the casing and the battery every single time.
    • During the development of the N-Gage, Nokia held a conference where they invited representatives from various game developers to test a prototype model of the N-Gage and give feedback. After receiving numerous suggestions on how to improve the N-Gage, Nokia promptly ignored most of them on the grounds that they were making the machines through the same assembly lines as their regular phones and they were not going to alter that.
  • As far as bad console (or rather, console add-on) design goes, the all-time biggest example is probably the Atari Jaguar CD. Aside from the crappy overall production quality of the add-on (the Jaguar itself wasn't too hot in this department, either) and poor aesthetics which many people have likened to a toilet seat, the CD unit sat on top of the Jaguar and often failed to connect properly to the cartridge slot, as opposed to the similar add-ons for the NES and Sega Genesis which used the console's own weight to secure a good connection. Moreover, the disc lid was badly designed and tended to squash the CD against the bottom of the console, which in turn would cause the disc motor to break apart internally from its fruitless attempts to spin the disc. All of this was compounded by Atari's decision to ditch any form of error correction so as to increase the disc capacity to 800 megabytes, which caused software errors aplenty, and the fact that the parts themselves tended to be defective.

    Of note, it was not rare for the device to come fresh from the box in such a state of disrepair that highly trained specialists couldn't get it working – for example, it could be soldered directly to the cartridge port and still display a connection error. This, by the way, is exactly what happened when James Rolfe tried to review the system.
  • The Sega Saturn is, despite its admitted strong points on the player end, seen as one of the worst major consoles internally. It was originally intended to be the best 2D gaming system out there (which it was), so its design was basically just a 32X with higher clock speeds, more memory, and CD storage. However, partway through development, Sega learned of Sony's and Nintendo's upcoming systems (the PlayStation and Nintendo 64 respectively), both designed with 3D games in mind, and realized the market – especially in their North American stronghold – was about to shift under their feet; they wouldn't have a prayer of competing. So, in an effort to bring more and more power to the console, Sega added an extra CPU and GPU to the system. Sounds great! …Until you consider that there were also six other processors that couldn't interface too well. This also made the motherboard prohibitively complex, making the Saturn the most expensive console of its time. And lastly, the GPU's basic primitive had four sides (the industry standard was three). This made multiplatform games tricky to work with on the Saturn.
    • Hilariously, the Saturn is an early mainstream example of a dual-CPU design. It took several more years for chip manufacturers to take full advantage of what Sega had created by accident.
  • They all have nothing on the infamous A20 line. Due to a quirk in how its addressing system workednote , Intel's 8088/86 CPUs could theoretically address slightly more than their advertised 1 MB. But because they physically had only 20 address pins, the resulting address simply wrapped around, so the last 64K of the address space was actually the same as the first. Some early programmersnote  were, unsurprisingly, stupid enough to use this almost-not-a-bug as a feature. So when the 24-bit 80286 rolled in, a problem arose – nothing wrapped any more. In a truly stellar example of "compatibility is God" thinking, IBM's engineers couldn't think up anything better than to simply block the offending 21st pin (the aforementioned A20 line) on the motherboard side, making the 286 unable to use a solid chunk of its memory above 1 meg until this switch was turned on. This might have been an acceptable (if very clumsy) solution had IBM defaulted to having the A20 line enabled and provided an option to disable it when needed, but instead they decided to have it always turned off unless the OS specifically enabled it. By the 386 era, no sane programmer used the wraparound trick any more, but turning the A20 line on is still among the very first things any PC OS has to do. It wasn't until Intel introduced the Core i7 in 2008 that they finally decided "screw it" and locked the A20 line into being permanently enabled.
  • AMD hasn't been all good ideas, either:
    • Their wildly successful Athlon Thunderbird ran at high speeds and for a while obliterated everything else on the market, but it was also the hottest CPU ever made up until that point. This wouldn't be so bad in and of itself – even hotter CPUs were made by both AMD and Intel in later years – but the Thunderbird was special in that it had no heat-management features whatsoever. If you ran one without the heatsink – or, more plausibly, if the heavy chunk of aluminium sitting on the processor broke the mounting clips through its sheer weight and dropped to the floor of the case – the processor would insta-barbecue itself.
    • In late 2006 it was obvious that Intel were determined to pay AMD back for the years of ass-kickings it had endured at the hands of the Athlon 64, by releasing the Core 2 Quad only five months after the Core 2 Duo had turned the performance tables. The Phenom was still some ways off, so AMD responded with the Quad FX, a consumer-oriented dual-processor platform that could mount two dual-core chips (branded as Athlon 64s, but actually rebadged Opteron server chips). While re-purposing Opterons for desktop use was something that had worked magnificently three years prior, this time it became obvious that AMD hadn't thought things through. Not only was this set-up more expensive than a Core 2 Quad (the CPUs and motherboard worked out to about the same price, but you needed twice the memory modules, a more powerful PSU and a copy of Windows XP Professional), but it generally wasn't any faster, and in anything that didn't use all four cores actually tended to be far slower, as Windows XP had no idea how to deal with the two memory pools created by the dual-CPU set-up (Vista was a lot more adept in that department, but had its own set of problems).
    • Amazingly enough, things went From Bad to Worse when the Phenom eventually did arrive on the scene. In addition to being clocked far too slow to compete with the Core 2 Quad — which wasn't really due to any particular design flaw, other than its native quad-core design being a little Awesome, but Impractical — it turned out that there was a major problem with the chip's translation lookaside buffer (TLB), which could lead to crashes and/or data corruption in certain rare circumstances. Instead of either initiating a full recall or banking on the fact that 98% of users would never encounter this bug, AMD chose a somewhat odd third option and issued a BIOS patch that disabled the TLB altogether, crippling the chip's performance. They soon released Phenoms that didn't have the problem at all, but any slim hope of it succeeding went up in smoke after this fiasco.
      • Interestingly enough, AMD fared better with the Phenom II, which dramatically improved performance, drawing close to Intel once again; their six-core chips were even good enough to attract some buyers.
    • AMD's Bulldozer architecture makes one wonder why they went that route. On the surface, having two integer cores share a floating-point unit makes some sense, since most operations are integer-based. Except those cores also shared an instruction decoder and scheduler, effectively making a single core with two disjointed pools of execution units. Each integer core was also weaker than a Phenom II core. To make matters worse, AMD adopted a deep pipeline and high clock frequencies — if anyone had paid attention to processor history, those two decisions were the root cause of the Pentium 4's failure.
  • VIA wasn't spared either. In the 2000s they were the third-largest x86 processor maker (and still are), yet their market share fell from 10% to less than 1%, largely because they barely improved their designs after the Pentium 4 era and kept shipping essentially the same technology for years afterward.
  • The iRex Digital Reader 1000 had a truly beautiful full-A4 eInk display... but was otherwise completely useless as a digital reader. It could take more than a minute to boot up, between 5 and 30 seconds to change between pages of a PDF document, and could damage the memory card inserted into it. Also, if the battery drained all the way to nothing, starting to charge it again would cause such a current draw that it would fail to charge (and cause power faults) on any device other than a direct USB-to-mains connector, which was not supplied with the hardware.
  • The Coleco Adam. First of all, its power supply was housed in the printer, meaning the whole computer was dead if the printer stopped working. It didn't even have a built-in OS – you had to load it from a cassette. This wouldn't be so bad if the computer didn't generate a powerful magnetic field the moment it was switched on, strong enough to erase nearby tapes. To top it off, owners were advised to start the computer with the tape still in the drive. Little effort was made to fix this beyond a disclaimer on the package itself.
  • The UEFI firmware on some Samsung laptops chokes and becomes bricked if the operating system writes too many variables to the firmware's storage. It was soon discovered that "too many" meant about 50% of capacity.
  • On top of the software problems mentioned in Games, the OUYA has a pretty bad design choice – the fan used to prevent overheating isn't pointed at either of the two vents. Never mind that the console uses a mobile processor, which doesn't need a fan. In theory, the fan would allow the processor to run at a higher sustained speed. In practice, it blows hot air directly against the wall of the casing, causing frequent issues due to overheating.
  • Motorola is one of the most ubiquitous producers of commercial two-way radios, so you'd think they'd have ironed out any issues. Nope, there are a bunch.
    • The MTX 9000 line (the "brick" radios) was generally made of Nokiamantium, but it had a huge flaw in the battery clips. The battery was held at the bottom by two flimsy plastic claws, and the clips at the top were just slightly thicker than cellophane, meaning that the batteries quickly became impossible to hold in without buying a very tight-fitting holster or wrapping rubber bands around the radio.
    • The software to program virtually any Motorola radio, even newer ones, is absolutely ancient. You can only connect via serial port – an actual serial port; USB-to-serial adapters generally won't work. And the system it runs on has to be basically stone age (Pentium Is are generally too fast), meaning that in most radio shops, there's a 486 in the corner just for programming them. Even the new XPR line generally can't be programmed with a computer made after 2005 or so.
      • If you can't find a 486 computer, there's a build of DOSBox that floats around ham circles with beefed-up code to slow down the environment even more than is possible by default. (MTXs were very popular for 900MHz work because, aside from the battery issue, they were tough and cheap to get, thanks to all the public agencies and companies that sold them off in bulk.)
  • VESA Local Bus. Cards were very long and hard to insert because they needed two connectors: the standard ISA slot plus an additional 32-bit connector hardwired to the 486's processor bus. The latter caused huge instability and incompatibility, which got even worse if a non-graphics VLB expansion card (usually an I/O controller) sat alongside the video card – a common result was crashes when a game using SVGA graphics accessed the hard drive. Cards also had to be carefully designed to cope with the multiple possible bus clock frequencies. All of this led to the 486-dependent VLB being replaced by PCI in Pentium-class computers and even on newer 486 boards.
  • The Radio Shack TRS-80 (model 1) had its share of hardware defects:
    • The timing loop constant in the keyboard debounce routine was too small. This caused the keys to "bounce" - one keypress would sometimes result in 2 of that character being input.
    • The timing loop constant in the tape input routine was wrong. This made the volume setting on the cassette player extremely critical. This problem could somewhat be alleviated by placing an AM radio next to the computer and tuning it to the RFI generated by the tape input circuit, then adjusting the volume control on the tape player for the purest tone from the radio. Radio Shack eventually offered a free hardware modification that conditioned the signal from the tape player to make the volume setting less critical.
    • Instead of using an off-the-shelf Character Generator chip in the video circuit, RS had a custom CG chip programmed, with arrow characters instead of 4 of the least-used ASCII characters. But they made a mistake and positioned the lowercase "a" at the top of the character cell instead of at the baseline. Instead of wasting the initial production run of chips and ordering new chips, they eliminated one of the video-memory chips, added some gates to "fold" the lowercase characters into the uppercase characters, and modified the video driver software to accommodate this. Hobbyists with electronics skills were able to add the missing video memory chip, disconnect the added gates and patch the video driver software to properly display lowercase, albeit with "flying a's". The software patch would have to be reloaded every time the computer was booted. Radio Shack eventually came out with an "official" version of this mod which included a correctly-programmed CG chip.
    • The biggest flaw in the Model 1 was the lack of gold plating on the edge connector for the Expansion Interface. Two-thirds of the RAM in a fully-expanded TRS-80 was in the EI, and the bare copper contact fingers on the edge connector oxidized readily, resulting in very unreliable operation. It was often necessary to shut off the computer and clean the contacts several times per day. At least one vendor offered a "gold plug", which was a properly gold-plated edge connector which could be soldered onto the original edge connector, eliminating this problem.
    • In addition, the motherboard-to-EI cable was very sensitive to noise and signal degradation, which also tended to cause random crashes and reboots. RS attempted to fix this by using a "buffered cable" to connect the EI to the computer. It helped some, but not enough. They then tried separating the 3 critical memory-timing signals into a separate shielded cable (the "DIN plug" cable), but this still wasn't enough. They eventually redesigned the EI circuit board to use only 1 memory timing signal, but that caused problems for some of the unofficial "speed-up" mods that were becoming popular with hobbyists.
    • The Floppy Disk Controller chip used in the Model I EI could only read and write Single Density disks. Soon afterwards a new FDC chip became available which could read and write Double Density (a more efficient encoding method that packs 80% more data in the same space). The new FDC chip was almost pin-compatible with the old one, but not quite. One of the values written to the header of each data sector on the disk was a 2-bit value called the "Data Address Mark". 2 pins on the single-density FDC chip were used to specify this value. As there were no spare pins available on the DD FDC chip, one of these pins was reassigned as the "density select" pin. Therefore the DD FDC chip could only write the first 2 of the 4 possible DAM values. Guess which value TRS-DOS used? Several companies (starting with Percom, and eventually even Radio Shack themselves) offered "doubler" adapters - a small circuit board containing sockets for both FDC chips! To install the doubler, you had to remove the SD FDC chip from the EI, plug it into the empty socket on the doubler PCB, then plug the doubler into the vacated FDC socket in the EI. Logic on the doubler board would select the correct FDC chip.
  • The TRS-80 model II (a "business" computer using 8-inch floppy disks) had a built-in video monitor with a truly fatal flaw. The sweep signals used to deflect the electron beam in the CRT were generated from a programmable timer chip. When the computer booted, one of the first things it would do is write the correct timer constants to the CRTC chip. However, an errant program could accidentally write any other values to the CRTC chip, which would throw the sweep frequencies way off. The horizontal sweep circuit was designed to operate properly at just one frequency and would "send up smoke signals" if operated at a significantly different one. If your screen goes blank and you hear a loud high-pitched whine from the computer, shut the power off immediately, as it only takes a few seconds to destroy some rather expensive components in the monitor.
  • Nvidia's early history is interesting… in the same way a train wreck is. There's a reason why their first 3D chipset, the NV1, barely gets a passing note in the official company history page. See, the NV1 was a weird chip which they put on an oddball – even for the times – hybrid card meant to let you play specially ported Sega Saturn games on the PC. The chip's weirdness came from its quadratic primitives, when everybody else used, and has used ever since, triangles. Developing for a quad-supporting chip was complicated, and porting previously existing triangle games to quads wasn't much better either, so the NV1 was wildly unpopular from the start. Additionally, the hybrid cards integrated other features (such as MIDI playback) that weren't needed and increased cost and complexity. When Microsoft came out with Direct3D it effectively killed the NV1, as the chip was all but incompatible with it. Nvidia stubbornly went on to design the NV2, still with quad mapping, intending to put it in the Dreamcast – but then Sega saw the writing on the wall, told Nvidia "thanks but no thanks" and used an NEC PowerVR instead. Nvidia finally saw the light, dropped quads altogether and came out with the Riva 128, which was a decent hit and propelled them onto the scene – probably with great sighs of relief from the shareholders.
  • Anytime your hardware has a firmware update, but the update utility is a Windows executable and the system is running another OS, such as macOS or any flavor of Linux. To flash the firmware the right way, you'll probably have to remove the hardware, hook it up to a Windows machine and run the update from there. To be fair, it's a reasonable guess that the system will be running Windows, but issues like this can fan the flames of anti-Microsoft sentiment among alternative OS markets (like Debian-Linux derivatives). In the 1990s, this problem was usually averted by having the firmware utility create a bootable DOS-like disk with the flashing tool on it.
  • Improvements in low-power processor manufacture by Intel – namely the Bay Trail-T system-on-a-chip architecture – have now made it possible to manufacture an honest-to-goodness x86 computer running full-blown Windows 8.1 with moderate gaming capabilities in a box the size of a book. Cue a whole lot of confounded Chinese manufacturers using the same design standards they used on ARM systems-on-a-chip to build Intel ones, sometimes using cases with nary a single air hole and often emphasizing the lack of need for bulky heatsinks and noisy fans. Problem: You do actually need heat sinking on Intel SoCs, especially if you're going to pump them for all the performance they're capable of (which you will, if you use them for gaming or high-res video playback). Without a finned heatsink and/or fan moving air around, they'll just throttle down to crawling speed and frustrate the users.
  • Back in the early days of 3D graphics cards, when they were called 3D Accelerators and even 3Dfx hadn't found their stride, there was the S3 ViRGE. The card had good 2D performance, but such a weak 3D chip that at least one reviewer called it, with good reason, the world's first 3D Decelerator. That epithet is pretty much Exactly What It Says on the Tin, as 3D games performed worse on PCs with an S3 ViRGE installed than they did in software mode, i.e. with no 3D acceleration at all.

    Mass-Storage Mishaps 
Watch out, these storage techs may be harmful to your data, a pain to use, or a drain on your money due to lack of adoption.
  • The Commodore 64, one of the most popular computers of all time, wasn't without its share of problems. Perhaps the most widely known is its extreme slowness at loading programs. This couldn't really be helped with a storage medium like tape, which remained slow even after various clever speed-up schemes, but floppy disks really ought to have been faster. What happened was that Commodore had devised a hardware-accelerated system for transferring data that worked fairly well, but then found a hardware bug in the input/output chip that made it not work at all. Replacing the buggy chips was economically unfeasible, so the whole thing was revised to work entirely in software. This slowed down drive access immensely, and gave birth to a cottage industry of speeder carts, replacement ROM chips, and fastloaders, most of which sped up loading at least fivefold. Since the drive itself had a CPU and some RAM to spare, it was programmable, so people could improve on things, and did. Eventually a fastloader was developed that stored data in a non-standard format and loaded 25 times as fast as normal, and it was then cloned into freezer carts.
  • Why, after the introduction of integrated controllers into every other storage device, does the floppy have to be controlled by the motherboard? Sure, it makes the floppy drive simpler to manufacture, but you're left with a motherboard that has to know how to operate a spinning mass of magnetic material. Try making a floppy "emulator" that actually uses flash storage, and you'll run into this nigh-impassable obstacle.
    • The floppy drive interface design made sense when it was created (the first PC hard drives used a similar interface) and was later kept for backward compatibility. However, a lot of motherboards also support IDE floppy drives (there may not have been any actual IDE floppy drives, but an LS-120 drive identifies itself as a floppy drive and can read regular 3.5" floppy disks), and a SCSI or USB device can also identify itself as a floppy drive. On the other hand, the floppy interface is quite simple if you want to make your own floppy drive emulator.
  • Sony's HiFD "floptical" drive system. The Zip drive and the LS-120 SuperDisk had already attempted to displace the aging 1.44MB floppy, but many predicted that the HiFD would be the real deal. At least until it turned out that Sony had utterly screwed up the HiFD's write head design, which caused performance degradation, hard crashes, data corruption, and all sorts of other nasty problems. They took the drive off the market, then brought it back a year later... in a new 200MB version that was totally incompatible with disks used by the original 150MB version (and 720KB floppies as well), since the original HiFD design was so badly messed up that they couldn't maintain compatibility and make the succeeding version actually work. Sony has made a lot of weird, proprietary formats that failed to take off for whatever reason, but the HiFD has to go down as the worst of the lot.
  • The IBM Deskstar 75GXP, nicknamed the "Death Star". While it was a large drive by year-2000 standards, it had a disturbing habit of suddenly failing, taking your data with it. The magnetic coating was of subpar reliability and came loose easily, causing head crashes that stripped the magnetic layer clean off. One user with a RAID server setup reported to their RAID controller manufacturer that they were supposedly replacing their IBM Deskstars at a rate of 600-800 drives per day. Many hard drives have been criticized for various reasons, but the "Death Star" was truly spectacular for all the wrong reasons.

    There is anecdotal evidence that IBM engaged in outright deception, knowingly selling faulty products and then spewing rhetoric about industry-standard failure rates for hard drives. This denial strategy started a chain reaction that led to a collapse in customer confidence. Class-action lawsuits helped convince IBM to sell their hard drive division to Hitachi in 2002. (See "Top 25 Worst Tech Products of All Time" for this and more.)
  • The Iomega Zip disk was undeniably a big success, but user confidence in the drives' reliability was shattered by the "Click of Death". Though tens of millions of the drives were sold, thousands of them would suffer misalignment and damage whatever medium was inserted into the drive. That alone wouldn't necessarily have been horrible, but Iomega made a big mistake: it downplayed the users who complained about drive failures and was anything but sensitive about their lost data.

    The Zip's worst problem wasn't even the fact that it could fail and potentially ruin a disk, but that such a ruined disk would go on to ruin whatever drive it was then inserted into. Which would then ruin more disks, which would ruin more drives, et cetera. Effectively a sort of hardware virus, it turned one of the best selling points of the platform (inter-drive compatibility) into its worst point of failure.

    After a class-action lawsuit in 1998, Iomega issued rebates in 2001 toward future products. It was too little, too late: CD-R discs had become more popular for mass storage and were perceived as more reliable. The New Zealand site for PC World still has the original article available.
    • Iomega's previous magnetic storage product, the Bernoulli Box, was designed to avert exactly this kind of disaster: it exploited the Bernoulli effect, which makes it physically impossible for the drive head to make contact with the medium. When Iomega designed the Zip disk specification, the Bernoulli effect was left out, likely to save costs. Yes, Iomega already had a disk format designed to prevent the very failures the Zip drives suffered.
  • Maxtor, now defunct, once sold a line of external hard drives under the OneTouch label. However, the USB 2.0 interface would often malfunction and corrupt the filesystem on the drive, rendering the data hard to recover. You were better off removing the disk from the enclosure and installing it on a spare SATA connection on a motherboard. Not surprisingly, Maxtor was already having financial troubles before Seagate acquired them.
  • The 3M Superdisk and its proprietary 120MB "floptical" media were intended as direct competition to the Iomega Zip, but in order to penetrate a market that Iomega owned pretty tightly the Superdisk needed a special feature to push it ahead. That feature was the possibility to write up to 32 megabytes on a bog-standard 1.44MB floppy, using lasers for alignment of the heads. Back then 32MB was significant storage, and people really liked the idea of recycling existing floppy stock - of which everybody had large amounts - into high-capacity media. The feature might just have given the Superdisk the edge it needed; unfortunately what wasn't immediately clear, nor explicitly stated, was that the drive was only able to do random writes on its specific 120MB disks. It could indeed write 32MB on floppies, but only if you rewrote all the data every time a change, no matter how small, was made – basically like a CD-RW disk with no packet-writing system. This took a relatively long time, and transformed the feature into a gimmick. Disappointment ensued, and the format didn't even dent Iomega's empire before disappearing.
  • The Caleb UHD-144 was an attempt to gain a foothold in the floppy-replacement market. Unfortunately it was ill-timed; the company failed to take a hint from the failures of Sony and 3M, and the product never got a chance to prove itself in action. The Zip 250 and inexpensive CD-R media made the technology dead on arrival: a tragic example of a "good" idea rushed to market without checking what the competition had to offer. (The Zip 250 itself was quickly marginalized by cost-effective CD-R discs, which could be read in the optical drives already present in countless computers.)
  • Some DVD players, especially off-brand models, seem to occasionally decide that the disc you have inserted is not valid. The user ejects the disc and inserts it again, and hopefully the player decides to cooperate. This can be a headache if the machine is finicky about disc defects due to copy protection, or can't deal with the brand of recordable DVD-R/+R discs you use for your own films. Bonus points if you have to crack a copy-protected disc and burn it onto a blank DVD because you can't watch the master copy. The inverse is also possible: your DVD player is made by a "reputable" brand, and even it won't let you watch the locked-down DVD you just spent money on.
    • Some DVD players are overly cautious about the discs they're willing to play because of regional lockout. Live in Australia and have a legally purchased Region 4 DVD? Turns out it was an NTSC disc, and your DVD player only plays PAL discs. Oops.
  • After Solid-State Drives started taking over from mechanical hard drives as the storage device of choice for high-end users, it quickly became obvious that the transfer speeds would soon be bottlenecked by the speed of the Serial ATA standard, and that PCI Express was the obvious solution, albeit using it in the form of full-sized cards wasn't exactly optimal. The chipset industry's answer was SATA Express, a bizarre and horribly-designed solution which required manufacturers to synchronise data transfers over two lanes of PCI Express and two SATA ports, standards with completely different ways of working. Just to make it even worse, the cable for this standard was an ugly mess consisting of four separate wires (two SATA, one PCI-E, and one SATA power connector that just sort of hung off the end of the cable). The end result was one of the most resounding failures of an industry standard in computing history, as a grand total of zero storage products made use of it, with SSD manufacturers instead flocking to the SFF-8639 (later renamed U.2) connector, essentially just four PCI-E lanes crammed into a simple cable.

    Miscellaneous Examples 
Needless to say, there are more bad coders out there than good ones.
  • Actual code that causes these observed effects is a weekly feature at thedailywtf.com.
    • Also from the programming side, libraries developed by the Department of Redundancy Department where you have to lapse into Pokémon Speak to write any meaningful code. For instance, take this line from Debian's version of awesome's rc.lua:
      { "Debian", debian.menu.Debian_menu.Debian }
      • To be clear, "menu" is the only member of "debian", and "Debian_menu" is the only member of "debian.menu".
    • Java should have filled the niche of web-based games that Flash mostly owns... except that early versions of Java were so slow and so unnatural looking that Flash actually looked good in comparison. By the time they fixed it, Flash had become the de-facto standard for this kind of thing, much to the chagrin of just about everyone except Adobe.
  • The Computer Stupidities section of Rinkworks.com has its own section for programming.
  • Adding another Sony example to the lot already present on this page: when they first started out with their portable music players, Sony didn't support the MP3 standard, due to their historical unwillingness to support anything that could encourage piracy of any kind. Their players instead supported Atrac3, a proprietary Sony audio format. Being (somewhat surprisingly) smart enough to figure out that users would want to listen to their MP3 music, Sony sold the players with SonicStage, an upload program capable of converting MP3 files to Atrac. SonicStage promptly proceeded to annoy a whole lot of people: it was buggy and prone to crashing, and some computers couldn't run it at all (for reasons unknown), prompting many to return the players and switch brands.
    • Made worse by the fact that aside from the problems resulting from SonicStage, Atrac3 has better sound quality and a much smaller file size than the equivalent MP3. With the death of SonicStage due to the aforementioned problems and issues related to its closed source code, it's Lost Forever.
    • Creative did the same with their first hard-disk players: before they started supporting the MTP format (widely supported by many music managers), the only way you could upload stuff to them was by using the godawful PlayCenter program, later superseded by the even worse MediaSource. Many users preferred to keep PlayCenter: buggy as it was, at least it did its job sometimes. Both programs also attempted to set themselves as default players and music managers, further irritating users.
    • Ditto Microsoft with the Zune, at least initially. Read the Microsoft section for more.
  • Many streaming video websites have a ridiculously small buffer; quite often only 3 to 4 seconds of video, and possibly even less at HD resolutions. In optimum conditions it's not so problematic, but if the site is particularly busy or your Internet connection isn't fast, it can make videos all but unwatchable. Part of this problem is actually the politics of internet service providers. Content providers (YouTube and Netflix, for example) and service providers are in heated debates about who should fork over the dough for the upkeep of the network. When this fails to make progress, certain aspects of how videos get to your home suffer. It's not that content providers suck; it's that they have to jump through more hoops to get to you. You can read more about it in this article.
    • An additional problem caused by the introduction of ad breaks on certain longer videos, both for the commercial networks and independent web producers, is an annoying tendency for the players to just stop working after an ad break and never go back to the video originally being watched. Fortunately most sites are kind enough to remember where you were in the video and just pick up after the last commercial break, but several — which, annoyingly, includes Blip.tv — will force you to restart from the beginning. Which means sitting through some (if not all) of the ads again.
  • Computers don't truly have random number generation. Instead, they take a number (often something like the system time) as a "seed" and use that to generate a stream of random-looking numbers (it's called a "pseudo-random number generator", or PRNG) – and a given seed will always produce the same sequence each time it's used. This isn't Idiot Programming itself – the way computers work means that they can't actually generate proper random numbers without an outside source of randomness – but considering that this fact is mentioned in every basic computer science class worth its salt, any code that fails to take it into consideration is Idiot Programming.
    • For a specific example, at least one poker site was cracked this way. Someone wrote a program that, using the programmer's knowledge of the RNG and the player's own pocket cards, could tell the player everyone else's cards as well as the future cards that would come off the deck, completely breaking the game.
    • Another example was found in a keno machine which would always roll the same sequence of numbers after each reboot, since it used the same seed every time it booted up. This allowed one clever guy to win big three games in a row. Slot and keno machines nowadays prevent this by constantly advancing the seed even while the machine isn't being played, by either incrementing the seed number or just rolling and throwing away the outcomes.
    • RANDU, the infamous random number generator included on IBM computers in the 60s. How bad is it? Aside from the fact that every number is odd (which is very very bad on its own yet easy to work around), any three adjacently-generated numbers can be mathematically related into a plane (which looks like 15 planes when plotted due to modulo arithmetic).
      • And because the computer it was included with, the System/360 mainframe, is widely regarded as IBM's greatest work and was the computer of The '60s and The '70s, the generator became so widespread that traces of it still surface periodically, almost fifty years on.
    • Sega's 1988 arcade version of Tetris uses the exact same RNG seed every time it boots up, resulting in the infamous "power-on pattern". This goes against a major concept of Tetris (randomized pieces) and allows the player to simply use the pattern to plot out their piece placements to max out the score counter in as few pieces as possible, rather than playing against an RNG seed they don't know and dealing with whatever it throws out. Of course, you need to have machine-resetting privileges or be playing on an emulator to take advantage of this.
      • To be fair, the original arcade stored the seed in NVRAM, so resetting wouldn't be enough to get the pattern back. You would need to enter setup and clear the NVRAM, which arcade players could not do. But, like nearly every other Sega arcade game back then, it got bootlegged a lot, and the bootleggers left off the NVRAM, meaning the players COULD reset the pattern after all. Additionally, it generated a 1000 piece looping sequence, allowing for playing forever if you figured out how to handle the wrap around, instead of simply advancing the seed every piece, which would have been easier. An alternate arcade version of Sega Tetris for different hardware DID generate pieces as needed instead of a looping sequence, but that hardware didn't have NVRAM, so it ALSO had a power-on pattern.
    • Generally speaking, using the RNG to advance the RNG makes it less random, not more, as multiple positions will end up having the same result. This mistake is common enough to have its own name — "RNG jitter". One of the programs guilty of it is NetHack, as constantly mentioned in this April Fools' Day TAS submission.
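RANDU's badness is easy to demonstrate. The sketch below (a minimal Python rendition of the recurrence RANDU is documented to use, V(n+1) = 65539·V(n) mod 2³¹) shows both why a fixed seed is exploitable and why the outputs are so structured:

```python
def randu(seed, count):
    """Yield `count` RANDU outputs starting from `seed` (which must be odd)."""
    v = seed
    for _ in range(count):
        v = (65539 * v) % 2**31
        yield v

# Deterministic seeding: the same seed always produces the same stream,
# which is exactly what made the fixed-seed keno machine exploitable.
assert list(randu(1, 10)) == list(randu(1, 10))

# Every output is odd (an odd seed times the odd multiplier stays odd):
assert all(v % 2 == 1 for v in randu(1, 100))

# The "falls in planes" flaw: since 65539**2 % 2**31 == (6*65539 - 9) % 2**31,
# any three consecutive outputs satisfy x[n+2] == 6*x[n+1] - 9*x[n] (mod 2**31),
# so triples of outputs all lie on a small family of parallel planes.
xs = list(randu(7, 100))
for a, b, c in zip(xs, xs[1:], xs[2:]):
    assert c == (6 * b - 9 * a) % 2**31
```

The last loop is the whole scandal in three lines: a "random" generator whose every third value is a fixed linear combination of the previous two.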
  • Earlier versions of Java would install updates not by patching the existing installation, but creating a completely new installation in a separate folder without removing the old one.
    • This could be justifiable in a sense — keeping the old versions serves as a crude form of backwards compatibility, ensuring that older code would be able to find the version of Java it was meant to use. If the install was small enough (not that Java is known for its brevity), it would be somewhat practical, though inelegant in the extreme and therefore a Bad Thing.
    • Even scarier is the constant discovery of security vulnerabilities that allow malicious software to escape the touted "sandbox mode" and harm Windows. Some viruses will search for old, broken versions of Java installed on your computer alongside new ones, then proceed to use them to attempt administrator impersonation.
    • While we're on Java... it was intended to be a platform-independent programming language, somewhere in-between a compiled and an interpreted language. In practice, it worked okay on the processors used in Sun's own computers (SPARC) and on Intel-and-compatible processors (a market too large for Sun to ignore). On other architectures it tended to be horrendously inefficient and dog-slow.
    • Lately (Java 1.7 and higher) there's been a distressing tendency for each new Java "update" to be less efficient with memory, slower, and more draconian about its security policies, which the user cannot disable (there are various security options, but even the most permissive settings merely tone it down slightly). Instead of Security-Through-Obscurity, Java seems to be aiming for Security-Through-No-One-Can-Stand-To-Use-It-Anymore.
    • A rather... interesting occurrence happens when certain programs written for 32-bit Java get run on a system with the 64-bit Java Runtime Environment installed. Namely, they don't work. This is because the 32-bit version has certain features which the 64-bit version compiles into streamlined settings, but if those features are specifically called rather than used "as intended", anything from lag to memory leaks to random crashes can result, or the program just won't start at all. The work-around is to install both the 32-bit and 64-bit versions into different directories... but your system will only ever update one of them, so you now have to uninstall and reinstall Java with every update.
  • For graphing calculators, several different kinds of hardware often exist for linking them to computers, and different linking programs, from different authors, don't all support the same hardware. Worse, unlike with Texas Instruments calculators, linking applications for Casio calculators all used different, incompatible file formats on the PC.
  • While Windows Vista (see below) did introduce a ton of problems, it also did something that revealed many a programmer's idiot programming choice: assuming that the user account always had administrative rights. In Windows Vista, Microsoft introduced UAC, which would only assign standard user rights to a program, even if the user was an administrator. This is sensible, as it limits the damage that the program can do if it goes rogue. Programs that needed administrator rights were detected based on the file name and an optional configuration file called a manifest. Of course, older software that needed administrator rights knew nothing of manifests, and would fail in unpredictable ways, usually spouting an error message that wouldn't make the actual problem obvious to the non-technical (or necessarily even to the technical) — although Windows did sometimes spout a dialogue box along the lines of "Whoops, this program looks like it needs admin rights, but it didn't ask for them and I didn't realize until just now, do you want me to make sure it runs as an admin in future?".
  • So the designers of the Soviet Phobos space probe left testing routines in the flight computer's ROM — fair enough; everyone does the same, because removing them means retesting and recertifying the whole computer, which generally would've been plainly impossible without said routines. But to design the probe's operating system in such a way that a one-character typo in an incoming command would accidentally trigger a routine that turned off the attitude thrusters, leaving the spacecraft unable to point its solar panels at the Sun and recharge its batteries, effectively killing it, takes a special kind of failure.
  • Firefox 4's temporary file deletion algorithm was an unusual case, since it was actually pretty effective at deleting older files and freeing up large amounts of disk space. It suffered a major problem, though, in that it chewed up huge amounts of CPU power and maxed out the hard drive in the process, which could slow your entire system to a crawl. Worse still, there was no way of aborting the cleanup routine, and if you killed the Firefox process, it would just invoke the file cleaner again as soon as you restarted the browser. It wasn't until Firefox 5 that the file cleaner got fixed, using much less CPU power and still being fairly disk-intensive, but not to the same extent as previously.
    • A few months back, Firefox had an update that changed the system for syncing your passwords. This allowed recovery of accounts in case of losing your master decoder key, but had an outrageous problem: you could no longer sync passwords with a master password set on your Firefox profile, requiring users to disable it and then re-tick passwords in the sync list. Thankfully, Mozilla fixed the problem, and all is well again.
  • Tech sites have noted a rather disturbing trend in how certain handheld devices handle firmware updates. The sane way to do such an update over the internet is to check for the existence of updated firmware, download it, erase the old firmware, and then load the updated version. Ideally there's also a backup firmware chip, or some other way of restoring the device if things go pear-shaped. Unfortunately, a lot of devices (especially cheaper ones) don't actually do that — instead, they check for a firmware update, and upon getting confirmation that there is such an update, the device immediately wipes the old firmware, then downloads and installs the updated version. If anything goes significantly wrong during the download (i.e. loss of internet connection, loss of power, or a software error), the device will almost certainly be bricked. On top of that, most of the time there's no way to restore such a device to working order short of replacing the motherboard, and only about a 50/50 chance the manufacturer will replace it under warranty.
    • Curiously, most so-called "smart TVs" will do this — Samsung and Sony seem to be particularly bad about it.
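The difference between the two update orders above comes down to a few lines. This is a hedged sketch with a toy in-memory "device" (the `Device` class and function names are made up for illustration; a real implementation lives in a bootloader), but the ordering argument is the same:

```python
import hashlib

class Device:
    """Toy model of a gadget with a single firmware slot."""
    def __init__(self, firmware):
        self.firmware = firmware  # currently-installed image

    def unsafe_update(self, download):
        # The idiot order: wipe first, then download. If download() fails,
        # the device is left with no firmware at all -- bricked.
        self.firmware = None
        self.firmware = download()

    def safe_update(self, download, expected_sha256):
        # Sane order: download to staging, verify, and only then commit.
        staged = download()
        if hashlib.sha256(staged).hexdigest() != expected_sha256:
            raise ValueError("checksum mismatch; keeping old firmware")
        self.firmware = staged  # swap happens only after a good download

def flaky_download():
    # Simulates the connection dropping mid-transfer.
    raise ConnectionError("Wi-Fi dropped")

d = Device(firmware=b"v1")
try:
    d.safe_update(flaky_download, expected_sha256="")
except ConnectionError:
    pass
assert d.firmware == b"v1"  # the old firmware survives the failed update
```

Running `unsafe_update` with the same flaky download instead leaves `firmware` as `None`: the software equivalent of a warranty claim.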
  • A flight of ultra-high-tech F-22 Raptors suffered multiple computer failures and was practically crippled because their programming couldn't cope with the time-zone change of crossing the International Date Line. Somehow, it never occurred to the designers that a fighter aircraft just might cross the International Date Line, and nobody programmed its systems to adjust for it — which is a standard part of the programming of modern cellphones. This oversight resulted in a temporary grounding of all Raptors.
  • This is done In-Universe in Calvin and Hobbes: The Series. Calvin's time machine is set to power down when it's almost dead.
    • And it has to load when it refuels.
  • Many, many pieces of PC gaming hardware feature extremely fancy graphical interfaces and special effects for their drivers. Although they might make the hardware look attractive for the few minutes people will spend setting them up, this also has the effect of consuming extra system resources and thus interferes with the actual games the user might want to play.
    • The Razer StarCraft II series of hardware was perhaps the worst example of this. The Razer Spectre mouse, made for StarCraft II, had such an intrusive driver that a patch had to be issued for StarCraft II itself to prevent the slowdown it caused. The Razer Marauder keyboard, also made for StarCraft II, used the same intrusive driver; and when Razer sponsored their own StarCraft II team, the keyboards they issued them were... Razer BlackWidows.
  • At some point, the client software for AOL was so intrusive that a page on horrible software (now part of the Permanent Red Link Club) said that you shouldn't download it, and that the only way to get rid of it was to use a Live CD with support for NTFS partitions.
  • UEFI has become commonplace in modern computers, and it offers many benefits over the old BIOS, such as support for better interfaces (so configuration can be done easily) and none of the old limitations from the 80s, like being limited to 640KB of RAM. Hell, some computers don't even have so much as the old POST screen anymore. Thanks to the "compatibility support module" found in most UEFI firmware, it can still be used with older operating systems by emulating a BIOS. But of course, UEFI causes issues once in a while due to shoddy coding: for example, filling the UEFI memory with too many variables (which could be triggered by simply running Linux) could brick your laptop.
    • One issue with UEFI being generally prettified in comparison to BIOS is that some manufacturers remove the old POST screen without replacing it with anything but a non-removable manufacturer logo. The average user is unlikely to mind, but it's a massive pain to any tech who used to rely on the POST messages to figure out the hardware, and potential early errors, without having to load the OS and from there a hardware-scanning application.
  • vBulletin 5 has this in spades. Thousands of bugs? Check. Bugs that let members figure out what private sections exist, or that prevent searching, viewing more pages, or posting if JavaScript is disabled? Check. Changing every aspect of the URL structure on upgrade in one fell swoop and completely mauling sites' search engine rankings? Check again. There's even a code review that breaks down the poor programming line by line, filled with examples of Idiot Programming and inconsistency.
    • vBulletin 4 wasn't that great either; the old developers flew the coop after the company somehow thought selling out to Internet Brands was a good idea. All hell broke loose.
  • Zamfoo. This web hosting control panel software, as discussed at Web Hosting Talk and Reddit, is an absolute goldmine of examples of how NOT to program. Some examples:
    • The 'easy upgrade' script has you enter your server root login information on the company's website, which is then transmitted as plain text through HTTP.
    • The code (which is supposedly encrypted) is devoid of any logic or decent coding sense in the slightest, doing things like enabling and then disabling "strict mode" to "fix" errors and using only the most basic programming constructs possible.
    • The released updates were literally hacked in about five minutes flat, and didn't fix any of the issues properly.
    • Add "support" which amounts to threats and personal attacks, the coder's use of nulled (stolen) software, and the chance that the whole thing contains unlicensed code from other software, and you have a disaster in every possible way.
  • Want to register on Finale PrintMusic's forums (or certain others, such as the Magic Set Editor ones)? We hope you don't plan on being away from your account for too long, because it will deactivate after a few months and you can't log back in. You can't simply create a new account, either, because the old information will still be in the system and it won't allow you to use the same information for another account. You can contact a forum administrator and have them reactivate the old one, but you need to view the admin's profile to gain their contact information...and you can only view user profiles if you're logged into an account. The Catch-22 should be obvious.
    • As for Finale's programs themselves: in Finale Notepad 2003, there's no limit to how high or low you can place a note on the music sheet, but be warned: placing a note very high or very low can sometimes freeze the entire system, and the only solution is to force your computer to shut down with the power button.
  • Oracle's VirtualBox (like many of Oracle's products) has gotten a lot of flak for extremely poor programming. Version 4.3.14 is especially bad; despite the release notes saying it doesn't need to restart a computer running Windows after installing, it still does. After restarting, as with certain previous versions, guest operating systems won't boot. Unlike previous versions, before saying something vague about why the virtual machine won't start, it gives the user another error message, written in hex or something. After attempting to repair the installation (which actually worked in previous versions) and restarting again, this time VirtualBox itself won't start, instead greeting the user with the same indecipherable error message.
  • In general, any and all interfaces made for specific gadgets and intended to look non-threatening to the end user. This is less of a problem today than it used to be, thanks to universal interfaces such as USB, MTP and various other platform-agnostic protocols that let you just plug in a device and see its contents, upload to it, or just have it work with little-to-no attention required. Back when this stuff wasn't so well-baked, though, it was common to receive a nice software package with, say, your camera, and you needed that to get the pictures - which might well be in some proprietary format, or even encrypted - out of the device's memory and into your computer. The software was almost always Windows-only, natch, and typically written by someone who would have lost a programming competition against a trained monkey. So you'd get horrors like this, which might - if you were particularly lucky - actually let you ignore all the family-friendly features (filters! colors! cropping tools! Instagram before Instagram existed!) and just dump the damn data on the hard drive without crashing. But it wasn't unusual to have to relaunch the application several times before it would just do its job.
  • Some questions to Facebook support ask why it's possible to drag and drop entries in the TV Shows and Movies categories, but not other categories like Apps and Games or Music. There is no official answer. In previous versions of the Likes pages, drag and drop was possible in all categories, but this has remained disabled for more than a year. Enabling this should take the web programmers no more than ten minutes.
  • Many websites are distressingly bad at password security, as Tom Scott explains. The worst offenders will simply email your password to you in cleartext on request, which implies that the website stores passwords without hashing and salting them, letting password thieves log in with cracked stupid passwords like "12345". Scott argues that handling passwords yourself at all is Idiot Programming to begin with, and that programmers should delegate to a third-party login service like Facebook where possible, because security is so hard to get right.
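For contrast, the sane approach the entry alludes to (storing only a salted, deliberately slow hash) fits in a few lines of Python's standard library. A minimal sketch; the function names here are illustrative, and real sites should use a maintained library:

```python
import hashlib, hmac, os

def hash_password(password, salt=None):
    """Return (salt, digest) using PBKDF2-HMAC-SHA256 from the stdlib."""
    salt = salt or os.urandom(16)  # unique random salt per user
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify_password(password, salt, digest):
    # hmac.compare_digest avoids timing side channels on the comparison.
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return hmac.compare_digest(candidate, digest)

salt, digest = hash_password("12345")
assert verify_password("12345", salt, digest)
assert not verify_password("12346", salt, digest)

# Two users with the same bad password still get different stored hashes,
# which is what salting buys you:
salt2, digest2 = hash_password("12345")
assert digest != digest2
```

A site storing only `(salt, digest)` physically cannot email your password back, which is why "we'll send you your password" is such a reliable red flag.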
  • A misconfiguration in Steam's web caching on Christmas Day 2015 briefly let logged-in users see pages generated for other people's accounts. Serving cached pages to logged-in users is another big security no-no.
  • GOG.com's "Galaxy" client has certain issues with download size (namely getting it wrong) and download speed (roughly comparable to having the code yelled at you over the phone and programming it yourself). For example, take the Diamond edition of Neverwinter Nights. This comes to about 2.5GB, a maximum of 3 if you count all the music and avatars and so on that come as freebies. Galaxy will report the game size as being 5GB. It will then take an obscenely long time to download that 5GB; on a connection where Origin can get a game of that size downloaded within an eight hour period, Galaxy will take twelve. Read that again: 12 hours to download all five gigabytes of a three-gigabyte game. Thankfully you can just download games directly from your account on the site and save time, frustration and bandwidth.

    Programming Languages - It's All Greek to Me 
It's times like this that make you wonder if you'd rather stick to programming a VCR than deal with source-code debugging.
  • In a typical program in modern C++, 40-50% of the syntax will consist of calls to templates designed to shore up the language with modern features. C++ proponents argue that the fact that it's possible to hack in such features at all, albeit with difficulty, is evidence of the language's expressiveness and power. What most people who have to use C++ will agree on, however, is that libraries providing modern features in C++, like the Boost C++ Libraries (and even the C++ standard library, as of C++11), count as the other trope.
  • The phenomenon of "ceremony", where large amounts of boilerplate code must be written even for relatively simple functionality. In Java, for example, a class holding two values that can be read or written requires at least 4 lines, of which 2 are ceremony - or, if good OOP practice is followed, 10 lines of which 8 are ceremony. Some languages, such as Factor and Common Lisp, attempt to eliminate ceremony entirely, but are often complex to use and potentially hard to understand as a result.
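    The Java line-count above looks like this in practice (the `Point` classes are illustrative):

```java
// Bare-minimum version: 4 lines, of which the 2 class-wrapper lines
// are pure ceremony around the two values we actually care about.
class PointBare {
    int x;
    int y;
}

// "Good OOP" version: private fields plus getters and setters.
// 10 lines, of which 8 exist only to satisfy convention,
// not to add any behavior of their own.
class Point {
    private int x;
    private int y;
    public int getX() { return x; }
    public void setX(int x) { this.x = x; }
    public int getY() { return y; }
    public void setY(int y) { this.y = y; }
}
```

    For comparison, a Java 16+ record collapses the same idea to one line: `record Point(int x, int y) {}` - a belated admission that the ceremony was never load-bearing.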
  • Nullable types, the bane of many programmers. In older pointer- or object-based languages, any time a programmer declares that a function or procedure takes a complex value as input, it can instead be passed a "null" value representing the absence of any input, which will bring the program to a crashing (literally) halt if anything is done with it. This means the programmer must manually test every input of every subprogram to check whether it is null. Some modern languages such as Swift and Oxygene are starting to rectify this, but the problem remains firmly embedded in Java, C++, and C# - the most commonly used languages (although some development systems are implementing design contracts, allowing a program to guarantee that a value will not be null and warning the programmer when an input value does not provide this guarantee).
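    A minimal sketch of the hazard in Java (method names are illustrative). Note that the crash happens where the value is used, not where the null was passed in, which is exactly what makes these bugs hard to trace; `Optional` is the standard library's way of making "might be absent" explicit in the signature, the idea languages like Swift bake into the type system.

```java
import java.util.Optional;

class NullDemo {
    // Nothing in the signature warns the caller that s might be null;
    // if it is, this line throws NullPointerException at runtime.
    static int length(String s) {
        return s.length();
    }

    // Defensive version: the programmer must remember to check by hand,
    // for this input, and for every other nullable input in the program.
    static int safeLength(String s) {
        if (s == null) return 0;
        return s.length();
    }

    // Optional makes the absent case part of the type, so forgetting
    // to handle it is much harder.
    static int optLength(Optional<String> s) {
        return s.map(String::length).orElse(0);
    }

    public static void main(String[] args) {
        System.out.println(safeLength(null));            // prints 0
        System.out.println(optLength(Optional.empty())); // prints 0
        try {
            length(null);
        } catch (NullPointerException e) {
            System.out.println("crashed, as described above");
        }
    }
}
```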
  • Exceptions were originally a nifty way of allowing a function to report that something had gone wrong. Unfortunately, they have a major problem: if a function calls other functions that call other functions, it is possible that the top level caller has to handle so many potential exceptions that it cannot possibly deal with them all. This is a common cause of "an unknown error occurred" or "there was a problem" type error messages. Common Lisp attempted to work around this by offering restarts which allow the function to offer a method of fixing the problem as well as a notification that it happened, but these are complicated to use and often not that beneficial.
    • Worse, there are two common bad practices for "handling" exceptions. One is to "swallow" any exception that the section of the program doesn't know how to handle, silently discarding it and carrying on. Another is to repackage the exception (generally in a common format used by the company in question) while throwing away the information in the original exception. Both practices make it difficult for calling code to properly handle errors, due to lack of information.
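    Both bad practices can be sketched in a few lines of Java (the config-reading scenario and all names are illustrative):

```java
class ExceptionDemo {
    // Bad practice 1: "swallowing". The error vanishes and the caller
    // carries on with a half-done operation, never knowing anything failed.
    static String readConfigSwallowed() {
        try {
            throw new java.io.IOException("config file missing");
        } catch (Exception e) {
            // do nothing
        }
        return "default";
    }

    // Bad practice 2: repackaging without the cause. The original message
    // and stack trace are discarded, so all the caller can ever report is
    // "an unknown error occurred".
    static String readConfigRepackaged() {
        try {
            throw new java.io.IOException("config file missing");
        } catch (Exception e) {
            throw new RuntimeException("Error code 17");  // cause lost
        }
    }

    // The fix: chain the original exception as the cause, so the full
    // story survives the repackaging.
    static String readConfigChained() {
        try {
            throw new java.io.IOException("config file missing");
        } catch (Exception e) {
            throw new RuntimeException("could not read config", e);
        }
    }
}
```

    Java's `Throwable(String, Throwable)` constructor exists precisely for that last case; most other languages with exceptions have an equivalent chaining mechanism.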
  • The process of creating a GUI. On most operating systems, the programmer's method for creating a GUI is to build a big complicated data structure describing what's supposed to be in the window, then hand control over to the OS and wait for clicks to be reported. Very few languages have elegant ways of building structures of that type, leading to incredibly messy code. Some platforms tried to help by providing a way to create the data structures in question separately from the program… but eventually the program has to access them, and at least one such platform requires the programmer to write code to manually unpack the data structure again in order to get a reference.
  • Software bloat is a major issue today: programs get loaded up with more and more features without regard to how difficult it becomes to prevent security exploits in such a thick "book" of code.

    As an example, the Heartbleed exploit was a flaw in the encryption library OpenSSL that allowed an attacker to send requests that read data they should never have been able to see, enabling the theft of confidential information from websites. Theo de Raadt, founder and leader of the OpenBSD project, criticized the OpenSSL team for writing their own memory management routines, which bypassed the exploit countermeasures OpenBSD already had in place. It didn't help that the OpenSSL project was severely underfunded, with only two people to write, maintain, and test 500,000 lines of business-critical code. The LibreSSL project was founded partly to trim down the code bloat and make the software suite easier to harden against exploits.

    Threat Prevention Prevention Software 
It's never a good thing when your antivirus software is being just as annoying as an actual virus infection.
  • Thousands of computers using McAfee Antivirus were brought down on April 21, 2010 by an update that misidentified an essential Windows file as a virus, causing computers to constantly reboot. Barring exploits your average user wouldn't be familiar with in the slightest, users would have to go to an unaffected computer to get the fix and install it manually, as they couldn't go online on the infected PC.
    • McAfee's strength is that it blocks everything that might be a threat...which is also its main weakness. If you wish to use a program that it considers a threat (and as of this writing, it considers Dhux's Scar, among other things, to be such a program), you cannot get it to grant an exception. You're supposed to send McAfee's developers an email telling them it's a false alarm. If they don't respond, you need to disable McAfee every time you want to use the program.
    • On many computers, McAfee will make the CD drive stop working. And for a while, McAfee would often be stealthily installed by default during installations of something completely unrelated (and still is—looking at you, Adobe Flash Player).
    • McAfee is also easily thwarted by memory-resident viruses: back in the days of MS-DOS, the DOS version of VirusScan failed to detect the AntiCMOS virus if the system had been booted from an already-infected disk, even with an up-to-date signature file. By comparison, other antiviruses would have detected the virus in memory, frozen the PC, and instructed the user to reboot and run the scanner from a "rescue" floppy before proceeding.
  • On April 15th, 2013, Malwarebytes had a catastrophic false positive error, which caused it to mark every DLL, media file, and EXE as a trojan downloader and quarantine it. This ate up many users' hard drives, and rendered hundreds of machines inoperable, causing a large number of files to be Lost Forever.
  • Norton's notorious for this sort of thing:
    • Norton Internet Security blocks any and all images with certain dimensions, specifically those that are commonly used for advertisements. Problem is, at least one of the sizes is also commonly used by sites for non-ad purposes. In older versions, this could not be turned off without disabling all filtering completely.
    • Some older versions of Norton products, particularly Norton SystemWorks and Norton Internet Security, cannot be uninstalled without risking damage to the computer – PCs would wind up crashed, bricked or with corrupted files on the hard drive (including Windows Registry). Symantec had to create a special program for the sole purpose of safely and cleanly uninstalling Norton products, dubbed the Norton Removal Tool.
    • It is not unheard of for Norton Antivirus to declare itself a virus, and attempt to delete itself.
    • Norton 360 in particular…
      • …will block and delete any less-than-common executable run by the user. This includes code written by the user themselves.
      • …deletes DLL files at random. The screen does not let you override this. The "Learn More" button directs to a Japanese version of the Norton product page.
      • …disables (as of October 2012) ALL network access, including offline, even when the firewall is listed as disabled in the options. The only way to address this is by removing Norton, and this must be done with the Norton Removal Tool in order to reverse the damage done to networking components.
      • …its firewall disables Firefox. For no reason at all.
      • …increases the time needed to boot Windows by more than 500%.
    • Norton has also fallen prey to a host of other problems, such as a rather overzealous firewall and bad updates that at best gave BSODs when simply inserting a USB key and at worst forced users to perform a system restore.
    • Norton Antivirus's uninstaller also often accidentally deletes DLL files that are used by other software and drivers. One particular uninstall caused a system to BSOD and lose the ability to play sound upon reboot, because the deleted DLL was also being used by the Creative Sound Blaster Live! drivers; reinstalling those drivers was the only way to fix the issue and restore sound to the computer.
  • Symantec Antivirus tends to interfere with right-click menus and, more glaringly, cause BSODs within 5 minutes of starting up your computer more often than it actually blocks threats to the operating system. One could say that Symantec is a threat to your operating system.
  • Sophos Antivirus, some time in late 2012, released a virus database update which blocked files with certain extensions, but only when they were auto-downloaded by programs without user prompts. This was supposedly in response to a series of malware programs that would do just that in an attempt to remotely wrest control of Windows computers from their owners. However, one of the files that met the above criteria was the antivirus' auto-updater. Any system with that update would grind to a halt during startup. The patch (which wasn't released for nearly a month) had to be downloaded from a disk or drive while the computer was in safe mode, and in office networks, it could only be done by somebody with top-tier administration privileges. This was because the software, once disabled, would automatically reactivate (which isn't a bad feature on its own, but here, it exacerbated existing problems).
  • Sadly, AVG is getting in on the same problems as Norton.
    • As discussed above, the AVG Toolbar slows down your browser by absurd amounts. Its core process vprot.exe also eats up a tremendous amount of CPU time (on laptops it will often hog a whole core) even when the browser is idle or occasionally not open at all.
    • Not bad enough? In January 2012, the toolbar was bugged to continuously append to its log file, which could quickly grow upwards of 40 gigabytes of nothing but acknowledgments of calls to arcane functions. Anyone who used AVG around that era is advised to check WINDOWS/Temp/toolbar_log.txt to see if the bloated file is still there, because the toolbar prevents most disk cleanup tools from deleting it.
    • Or how about the 2010 update that rendered systems unbootable (and thus inoperable) by mistaking a critical file in 64-bit versions of Windows 7 for malicious coding?
    • Like Norton, it's also starting to randomly identify almost any file it's never seen before as a Trojan Horse. This mostly includes executables you developed yourself, arbitrary setup.exe's in your Downloads folder, and occasionally key files of development tools such as the Irvine Assembly libraries.
    • In the early days of 64-bit consumer Windows, AVG would detect Windows XP Professional x64 Edition as Windows Server 2003 R2 (the former is based on the latter) and, instead of installing, complain that the software wasn't licensed to run on server OSes and push the user to buy AVG's server antivirus instead.
  • Avast!:
    • Avast! made a blunder back in 2012 that caused the antivirus to freeze Windows XP Professional x64 Edition machines, and only those machines; the 64-bit versions of Vista and 7 were unaffected. The biggest kick in the nuts: to restore the machine to a usable state, one has to reboot into safe mode, which renders the uninstaller unusable because safe mode also disables the Windows Installer service. One has to somehow get the Avast removal program from Avast's website, move it to the affected PC, and run it in safe mode to remove Avast and return the PC to a usable state.
    • Like AVG and Norton, Avast is now starting to block and quarantine programs that you've compiled yourself.
    • Avast! gets very defensive when you try to install the Razer Synapse mouse software for your Razer mice. True, you can simply drop the "shields" and the installation occurs without issue, but whenever Synapse retrieves an update, Avast! will block the connection again, assuming that your computer is under attack, and likely frightening a novice user senseless. Simply plugging in a mouse such as a Razer DeathAdder is enough to cause Windows to start installing the Synapse software and trigger a "malware" warning. Malware is getting more and more sophisticated, but needlessly frightening beginner users is a dubious way to create awareness of it.
  • Panda Antivirus has joined the ranks of security software that flags itself as a threat, tries to intervene, and borks the system as a result.

Alternative Title(s): Genius Programming