->''"I love lag, especially when you have eight processors and ''no excuse''!"''
-->--'''''LetsPlay/KungFuJesus''''', ''[[LetsPlay/SonicTheHedgehog2006 Let's Play Sonic 2006]]''

You'd think it would be simple to program something to do something simple. Despite what some people may tell you, you're often right. In a perfect world, we wouldn't have programmers who constantly assumed that any simple task is a gargantuan effort that requires the importation of several processor-heavy, 100-megabyte libraries just to set up. In a perfect world, machines capable of performing 2 billion complex calculations per second wouldn't be brought to their knees by [[https://en.wikipedia.org/wiki/Spinlock spin-locks]] and [[https://en.wikipedia.org/wiki/Memory_leak memory leaks]]. And, in a perfect world, these programmers, and the manufacturers who made these devices, would all be bankrupt.

Alas, this is not a perfect world.

The other side of this is, of course, SugarWiki/GeniusProgramming.

For similar, usually gaming-specific issues, see also ArtificialStupidity and GameBreakingBug.

In general, a problem in a piece of software shouldn't be considered an example if it occurs in a pre-release (pre-alpha, alpha, or beta) build, or if there's no reason to believe that the software should be judged by professional standards (e.g. commercial software is fair game, but non-commercial software often isn't, unless it's intended to compete with commercial software).

There are also a few things that often aren't examples, even though they might look like they are -- in particular, software that appears to use a lot of memory or storage space on your machine. Your operating system might use a gigabyte of memory on your machine, but that's not because it actually needs it -- what's actually happening is that it's using extra memory in exchange for a speed boost. Even the most resource-hungry consumer [=OSes=] used today can run on dinosaurs (although finding such a rig may prove difficult).

Some galling examples can be found on Website/TheDailyWTF ([[http://thedailywtf.com/Default.aspx here]]), particularly the [[http://thedailywtf.com/Series/CodeSOD.aspx CodeSOD]] section.

! Brands With Their Own Folders

[[folder:Adobe – Its Name Is Mud, Literally.]]
Look on the bright side, Adobe software can be used to make satires of Adobe itself.
* UsefulNotes/AdobeFlash. You may notice it on this very site, taking up 100% of your CPU and 80 megabytes of your RAM to display a static image that would take up 12K as a JPG. Also seen on numerous video sites, as a player that drags brand new multicore, multigigahertz computers to their knees in order to jerkily fail playing h.264 video that would run silky smooth on Pentium [=IIs=] or [=G3s=] as unwrapped files. Thank heavens for [[https://addons.mozilla.org/en-US/firefox/addon/433/ Flash]][[https://chrome.google.com/webstore/detail/flashblock/gofhjkjmkpinhpoiabjplobcaignabnl block]]. And also praise the Builder that [=YouTube=] videos can be streamed into external video players, such as VLC or [=SMPlayer=] with its very own [=YouTube=] browser. It's not perfect, but it is great for bypassing the browser bloat on single-core [=CPUs=], or simply to save CPU time for multi-tasking. This is far more efficient, as it makes heavy use of hardware video acceleration AND needs only a quarter to half of the CPU on a Pentium 4 2.8 [=GHz=] system. Also, memory use is [[SugarWiki/GeniusProgramming almost non-existent]].\\
Even simple programs, like [[VideoGame/TheGameOfLife Conway's Game of Life]], which can easily run at 70 frames per second on a 486SX when written in C, will struggle to reach 5 frames per ''minute'' when the same logic is implemented in Flash on a computer more than a hundred times faster. And sometimes, the already-poor performance of Flash is compounded by the often badly coded applications written for it. To give an example, Creator/TheBBC embeds audio and/or video files in pretty much every article on the BBC News website. Unfortunately, the initial version of the Flash app they used to do this was so badly designed that any system with a processor below a Core i7 was pretty much guaranteed to be utterly brought to its knees for several minutes at a time while the player loaded. It took months for the app's performance problem to be fixed.\\
A couple of versions ago, the Windows Flash installer would sometimes report insufficient disk space even when there was no such problem. The reason? The installer checked drive C for space, regardless of the actual destination drive, and even if C wasn't assigned to a hard drive at all. Compounding all these problems is the fact that Adobe appears to be deliberately crippling Flash in its capacity to perform its original purpose -- vector-based animation -- to try to get people to use it for what ''they'' want, which seems to be websites (hands up, everyone who thinks this sounds reasonable. Anyone? No one? Good).\\
Flash has issues on the security front as well. For example, in early 2015, over a two-week period, Flash had ''three'' zero-day exploits appear and attack unsuspecting users. As each one was patched, the next one appeared to make people's lives more miserable. There is a movement that Adobe is passively supporting called "Occupy Flash" that is pleading with users to uninstall the plug-in; with the rise of [=HTML5=] content, this is quite feasible for certain people.\\
The problem has gotten so bad that Google has been using an in-house reimplementation called Pepper Flash for years in the desktop versions of Google Chrome. Since Google is begrudgingly maintaining Pepper Flash for now (though they'd rather we used [=HTML5=] and [=WebGL=]), it's considered the definitive (!) version of Flash, especially on desktop Linux (which Adobe stopped supporting after Flash 11).\\
On July 14, 2015, Mozilla announced that ''all'' versions of Flash were now blocked by default in Firefox, citing Adobe's slow response to patching publicly available security exploits, to the delight of tech people the world over. However, Adobe soon issued patches, and Flash was unblocked -- though the general feeling is that this incident added another nail to a coffin that's been a '''long''' time coming. Website/YouTube has "given the finger" to Flash as well, and currently uses its own [=HTML5=] player by default. The memory consumption is reasonable for today's budget computers (budget Pentium and AMD APU systems, for example) considering it's running the full-featured [=YouTube=] site. Also, unlike the Flash player, it supports 60 frames per second as well as playback speed options. A good omen, perhaps.\\
Due to the decline in usage caused by all these reasons, Adobe announced in July 2017 that they were phasing out support for Flash, and it would be fully retired by the end of 2020.
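For reference, the Game of Life logic mentioned above really is tiny -- here's a minimal sketch in Python (not the C or Flash versions the entry describes, just an illustration of how little work one generation actually takes):

```python
# Minimal Conway's Game of Life step, storing live cells as a set of
# (x, y) tuples. One generation is just neighbour counting plus two
# rules -- the kind of workload a 486SX handles with ease.
from collections import Counter

def step(live):
    """Return the next generation of a set of live cells."""
    # Count how many live neighbours each candidate cell has.
    neighbour_counts = Counter(
        (x + dx, y + dy)
        for (x, y) in live
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    # A cell is alive next turn with exactly 3 neighbours, or with 2 if
    # it was already alive.
    return {
        cell
        for cell, n in neighbour_counts.items()
        if n == 3 or (n == 2 and cell in live)
    }

# A "blinker" oscillates between horizontal and vertical with period 2.
blinker = {(0, 1), (1, 1), (2, 1)}
print(sorted(step(blinker)))           # [(1, 0), (1, 1), (1, 2)]
print(step(step(blinker)) == blinker)  # True
```

That the same handful of operations per cell could be dragged down to frames per minute says everything about the runtime underneath.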
* Adobe Acrobat and Adobe Reader aren't much better either. Somehow, Ctrl+C to copy doesn't always work, despite the fact that this is one of the most basic features of any program that can display text. Sometimes it works, sometimes it silently fails, sometimes it fails and pops up an error message saying "An internal error has occurred", and sometimes it works but ''still'' pops up that error. And if you hit Ctrl+C four times in a row in a futile attempt to copy the same text, you might get one of each result despite the fact that nothing else changed between attempts. It's like playing a slot machine.\\
Then there's the fact that the installer likes to add bloatware to the system in the form of "helper" programs that start with Windows, stay running as background tasks, and which the main program runs fine without, and the equally unneeded "Adobe AIR". Google searches for "Adobe Reader without air" are very common.\\
Also, Linux versions were somewhat troublesome. Sometimes, when having several documents opened, you cannot change between them using the mouse -- same for using the menus. You have to "unlock" it from the document, by either clicking on the document, or using the keyboard to activate the menus. Adobe eventually stopped developing Adobe Reader for Linux, but that's not really a bad thing when less bloated PDF viewers, like Evince and Okular, exist (especially since those two also support other formats).\\
Early editions of Adobe Reader X would take forever to display a single PDF, take up huge amounts of processing power, and even had parts of the menu and toolbars disappear at random.
* Adobe Dreamweaver is known to crash when trying to load a file with a size that's an exact multiple of 8,192 bytes. [[https://forums.adobe.com/thread/417116?start=0&tstart=0#eightkb This is actually a known issue and documented on Adobe's official support forums.]] The recommended solution? ''Open up the file in a text editor and add a few characters to change the file size.''
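The recommended workaround really does boil down to something this trivial -- a hypothetical sketch (the function name is made up, and appending a newline is just one way to nudge the size):

```python
# Sketch of the suggested "fix": if a file's size is an exact multiple
# of 8,192 bytes, append a byte so the file size changes and the
# crash-triggering condition no longer holds.
import os

def pad_if_cursed(path):
    """Append a newline if the file size is a multiple of 8,192 bytes."""
    size = os.path.getsize(path)
    if size > 0 and size % 8192 == 0:
        with open(path, "ab") as f:
            f.write(b"\n")
        return True
    return False
```

Needless to say, the actual fix belongs in Dreamweaver, not in a script run by every user who happens to save an 8K-aligned file.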
* Although Adobe Photoshop has been the standard tool for professional digital artists for years, later releases and developments have an increasingly disillusioned artist userbase investigating the products of competitors and open source projects.\\
For starters, the company has begun billing the program -- and going to great lengths to optimize it -- as an idiot-proofed meme generator and photo sharpener targeted toward the [[MoneyDearBoy lucrative]] "casual users with way too much money to burn who think [[MagicalComputer Photoshop is magic]]" market. As a result, the "exciting new features" added to Photoshop are frequently things like yet another way to remove red-eye, while [[PerpetualBeta basic glitches and quirks that artists have been wanting fixed for years remain ignored]]...\\
These problems include the inability to freely scroll the canvas in certain modes[[note]]which they did fix...on Macs. ''Some'' Macs.[[/note]], the failure to update key parts of the program like the "liquefy" tool for modern 64-bit multicore operation, and the lack of fundamental industry-standard utilities like ''total'' keybinding control, tiled windows in full screen mode, and changing the selected tool when the user changes tablet styluses[[note]]something nearly every other freeware or paid program has been able to do since about 2009...right around the time when Adobe inexplicably ''removed'' the feature from Photoshop.[[/note]].\\
The best part? Those aforementioned "exciting new features" occasionally displacing, complicating, and/or generally eroding the performance of the SugarWiki/GeniusProgramming aspects of the program that endeared it to artists in the first place. Meanwhile, Adobe seems baffled by the fact that many artists have expressed a complete disinterest in trading their already-paid-for program for a rented one that introduces more such issues on a regular basis.

[[folder:Apple Should Have Thought Different]]
* Apple products, especially iTunes, have a habit of downloading updates, installing them, then leaving behind tens or even hundreds of megabytes (per update!) worth of temporary files which it doesn't clean up. Even if you update your iPod firmware, it'll leave behind the temporary files on the computer you used to update it. To add insult to injury, it leaves these files in its own application data directory instead of trying to look for a system-designated temporary directory, meaning any other program trying to find and clean up unneeded temporary files won't notice. It's like living with the worst roommate ever. To get rid of these wastes of space, you have to dig through your file system to find Apple's directory, look for the temporary directory within that, and delete the junk yourself.\\
It also always restarts the computer on Windows-based systems after applying the updates, without warning, even if it doesn't need the restart. It's annoying when you leave the updater running in the background and all of your programs suddenly start to close.\\
Oh, and let's not forget the increasingly common issue where, for no discernible reason, any TV show episodes you bought from the iTunes store will suddenly stop playing at all, rendering nothing but a silent black screen and forcing a complete reinstall of iTunes and [=QuickTime=] just to get your TV shows watchable again.
* For some reason, when Apple was releasing Safari for Windows for the first time, it had a tendency to crash when attempting to bookmark a page. A basic web browser action, and it killed the program.
* It should be noted that in the [=Pwn2Own=] hacking contest, Mac OS X was routinely the quickest to fall to an outside attack -- and by quickest, we mean in less than a minute. A common entry point for exploits? Safari. Whether or not Apple is improving browser security remains to be seen.
* The original Apple II was one of the first home computers to have color graphics, but it had its share of problems:
** Steve Wozniak studied the design of the electronics in Al Shugart's floppy disk drive and came up with a much simpler circuit that did the same thing. But his implementation had a fatal flaw: the connector on the interface cable that connected the drive to the controller card in the computer was not polarized or keyed -- it could easily be connected backwards or misaligned, which would fry the drive's electronics when the equipment was powered up. Shugart used a different connector which could not be inserted misaligned, and if it were connected backward it wouldn't damage anything; it just wouldn't work. Apple "solved" this problem by adding a buffer chip between the cable and the rest of the circuit, whose purpose was to act as a multi-circuit fuse which would blow if the cable were misconnected, protecting the rest of the chips in the drive.
** The power switch on the Apple II power supply was under-rated and had a tendency to burn out after repeated use. Unlike the "fuse" chip in the disk drives (which was socketed), the power switch was not user-replaceable. The recommended "fix": leave the power switch "on" all the time and use an external power switch to turn the computer off. At least one vendor offered an external power switch module shaped to fit nicely behind the Apple II, but most users simply plugged their computer into a standard power strip and used its on/off switch to turn their equipment off.
** The [=AppleSoft=] BASIC interpreter (which was ported from the Microsoft BASIC interpreter used on the Altair and TRS-80 computers, and others) did not implement the Boolean operators (AND, OR and NOT) properly. On most other BASIC interpreters (as well as many other "mathematical" programming languages), a programmer could isolate individual bits in an integer using these operators, but not in [=AppleSoft=].
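For illustration, here's the kind of bit isolation those operators allowed on most other platforms -- shown in Python rather than BASIC, since modern readers are more likely to have it handy:

```python
# Isolating individual bits with bitwise operators -- the trick that
# worked in most BASIC dialects but not in AppleSoft, whose AND/OR/NOT
# only returned true/false (1/0) rather than bitwise results.
flags = 0b10110100

bit_2 = (flags >> 2) & 1    # shift bit 2 down, then mask it -> 1
bit_3 = (flags >> 3) & 1    # same for bit 3 -> 0
low_nibble = flags & 0x0F   # mask off the low four bits -> 0b0100

print(bit_2, bit_3, low_nibble)  # 1 0 4
```

Without bitwise AND, an [=AppleSoft=] programmer had to fake the same thing with division and subtraction, which is exactly as pleasant as it sounds.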
* The old Apple III was three parts stupid and one part hubris; the case was completely unventilated and the CPU didn't even have a heat sink. Apple reckoned that the entire case was aluminum, which would work just fine as a heat sink, no need to put holes in our lovely machine! This led to the overheating chips actually becoming unseated from their sockets; tech support would advise customers to lift the machine a few inches off the desktop and drop it, the idea being that the shock would re-seat the chips. It subsequently turned out that the case wasn't the only problem, since a lot of the early Apple [=IIIs=] shipped with defective power circuitry that ran hotter than it was supposed to, but it helped turn what would have otherwise been an issue that affected a tiny fraction of Apple [=IIIs=] into a widespread problem. Well, at least it gave Website/{{Cracked}} something to joke about.
** A lesser, but still serious design problem existed with the Power Mac G4 Cube. Like the [=iMacs=] of that era, it had no cooling fan and relied on a top-mounted cooling vent to let heat out of the chassis. The problem was that the Cube had more powerful hardware crammed into a smaller space than the classic [=iMacs=], meaning that the entirely passive cooling setup was ''barely'' enough to keep the system cool. If the vent was even slightly blocked however, then the system would rapidly overheat. Add to that the problem of the Cube's design being perfect for putting sheets of paper (or worse still, books) on top of the cooling vent, and it gets worse. Granted, this situation relied on foolishness by the user for it to occur, but it was still a silly decision to leave out a cooling fan (and one that thankfully wasn't repeated when Apple tried the same concept again with the Mac Mini).
** Another issue related to heat is that Apple has a serious track record of not applying thermal grease appropriately in their systems. Most DIY computer builders know that a rice-grain-sized glob of thermal grease is enough. Apple pretty much caked the chips that needed it with thermal grease.
** Heat issues are also bad for [=MacBook=] Pros. Not so much for casual users, but very much so for heavy processor load applications. Since the MBP is pretty much ''de rigueur'' for musicians (and almost as much for graphic designers and moviemakers), this is a rather annoying problem, since Photoshop with a lot of images or layers, or any music software with a large number of tracks, WILL drive your temperature through the roof. Those who choose to game with a MBP have it even worse -- ''VideoGame/WorldOfWarcraft'' will start to cook your MBP within 30 minutes of playing, especially if the room is warm. The solution? Get the free software programs Temperature Monitor and [=SMCFanControl=]. Keep an eye on your temps and be very liberal with upping the fans: the only downsides to doing so are more noise, a drop in battery time, and possible fan wear -- all ''far'' better than your main system components being fried or worn down early.
* Apple made a big mistake with one generation of the [=iPhone=]: depending on how you held it, it could not ''receive signals''. The iPhone 4's antenna is integrated into its outside design as a bare, unpainted aluminum strip around its edge, with a small gap somewhere along the way. Good signal strength relies on this gap being open, but if you hold the phone wrong (which "accidentally" happens to be the most comfortable way to do so, especially if you're left-handed), your palm covers that gap and, if it's in the least bit damp, ''shorts'' it, rendering the antenna completely useless. Lacquering the outside of the antenna, or simply moving the air gap a bit so it doesn't get shorted by the user's hand, would've solved the problem in a breeze, but, apparently, Apple is much more concerned about its "product identity" than about its users. Apple [[NeverMyFault suggested that users "hold it right"]]. As it turns out, Apple would soon be selling modification kits for $25 a pop, for an issue that, by all standards, should have been fixed for free, or caught before the product hit the market. Apple was sued by at least three major parties over this.
* [=MacBook=] disc drives are often finicky, sometimes not reading the disc at all and getting it stuck in the drive. The presented solutions? Restarting your computer and holding down the mouse button until it ejects. And even ''that'' isn't guaranteed: sometimes the disc will jut out just enough that the [[NoSell solution won't register at all]], and pushing it in with a pair of tweezers finishes the job. To put this in perspective, technologically inferior ''video game consoles'' could do a slot-loading disc drive far better (UsefulNotes/{{Wii}}, UsefulNotes/PlayStation3).
* Apple Maps, which came pre-packaged in ObviousBeta form with iOS 6, [[http://www.guardian.co.uk/technology/2012/sep/20/apple-maps-ios6-station-tower had several glaring problems right off the bat.]] The satellite imaging technology was prone to inexcusable error, from odd patchworks of pictures taken under markedly different conditions (some of which wouldn't even be considered usable, as they were covered in clouds or very low-quality) to severely misshapen landscapes and buildings. The map function frequently misplaced entire cities (and, on at least one occasion, ''continents''), was outright missing countless locations (and creating several new ones), and the search and GPS were just plain broken. It has become the topic of ridicule, with [[http://theamazingios6maps.tumblr.com/ entire websites]] dedicated to mocking its shortcomings. Even the London Underground got in on it, with a [[http://thedroidguy.com/2012/09/london-underground-attempts-to-solve-ios-6-maps-issue/ sign]] reading "For the benefit of passengers using Apple iOS 6, local area maps are available from the booking office."\\
A few, hearing about the problems with the new Maps app, ''refused to upgrade to iOS 6'' until Google came out with a third-party Google Maps app for the platform (replicating the old app's functionality, if not its interface).\\
And that's only the start of the iPhone 5's troubles. People are reporting a purple flare when taking photos. Apple's advice? "You're just holding it wrong."
* Tired of the problems with the Microsoft Task Manager? Apple's is even worse. It's accessed from the "Apple menu" that always appears on the left-hand side of the top menu bar. But since in UsefulNotes/MacOS the top menu bar is always controlled by the selected window, a program can crash in a way that prevents you from opening that menu to close the crashed program. Oh, and just like on Windows, Finder and the Dock have no special privileges, so broken software can prevent you from doing the most basic file manipulation.
* Apple Music, at launch, had an unwanted "feature" -- [[https://blog.vellumatlanta.com/2016/05/04/apple-stole-my-music-no-seriously/ it scans your local files, deletes any music that it thinks matches something in its own catalog, and uploads anything it doesn't have to its servers as AAC, regardless of the original encoding]]. Which means not only is Apple deleting files off your drive, if you're an independent musician or composer, Apple ''steals'' your music. All because Apple thinks if you wanted to listen to your music from the cloud, there's no point in having a local copy, right? Never mind that you need a Wi-Fi connection for this to work. Oh, and if your subscription runs out, you can't access it, ''even for your own actual music''.
* [[BrokenBase Say what you will about the iPhone 7's merging of the headphone and charger jacks]], but there's no denying that upon closer inspection, this combined with the lack of wireless charging creates a whole new problem. As explained in [[https://www.youtube.com/watch?v=0vYfAqEyGV0 this video]], the charger jack is only capable of withstanding a certain amount of wear and tear (between 5,000 and 10,000 plugs, although only Apple themselves know the exact number). Because you're now using the same jack for two different things, chances are you'll wear it out twice as fast as any other iPhone. Because the phone doesn't have a wireless charging function like most other phones, this means that if this happens, your phone is pretty much toast.

[[folder:Google's Not Feeling So Lucky]]
Even Google needs to search for a clue at times.
* In an effort to restore the tainted image of Internet Explorer, Microsoft has touted that it's more power-efficient than competing browsers. They mostly weren't taken seriously... [[StrawmanHasAPoint until Google admitted to not bothering to patch a bug that's caused the Windows version of Chrome to guzzle battery life for years]].[[note]]To explain: Windows Vista, 7, and 8 use a default timer precision of 15.6ms. Programs that are waiting on something will schedule a timer based on this precision. Programs can adjust the precision to be smaller, down to 1ms. Google Chrome did just that and kept the timer firing even when it probably didn't need it. The side effect is that Windows gets less time to idle.[[/note]]
** Several years later, after Windows 10 launched and Edge replaced IE, [[https://blogs.windows.com/windowsexperience/2016/06/20/more-battery-with-edge/ Microsoft ran actual scientific tests to back up their claim]]. Edge won overall, with Opera and Firefox not too far behind, but Chrome was a '''distant''' last place. One test involved four identical Surface Books playing the exact same endless video until the battery died; Edge lasted a full 3 hours longer than Chrome.
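To put the timer note in rough numbers -- a back-of-the-envelope sketch using the figures from the note (the wakeup counts are an idealized upper bound, not a measurement):

```python
# How often a periodic timer can fire at each resolution. Windows'
# default tick is 15.6 ms; Chrome requested the 1 ms minimum and held
# it, so the CPU got far fewer chances to sit idle.
default_ms = 15.6   # Windows default timer resolution
chrome_ms = 1.0     # resolution Chrome requested

wakeups_default = 1000 / default_ms   # ~64 potential wakeups per second
wakeups_chrome = 1000 / chrome_ms     # 1000 potential wakeups per second

print(round(wakeups_default))  # 64
print(round(wakeups_chrome))   # 1000
```

Roughly fifteen times as many scheduling opportunities per second, held continuously, is how a browser quietly eats a laptop battery.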
* Unlike most other browsers, Google Chrome runs each tab in a separate process. As any programmer could tell you, the immediate downside of this is extraordinarily high RAM usage, as Chrome is essentially running a duplicate version of itself for every tab you have open. The ostensible purpose of this is stability; if something goes wrong in one tab (such as Flash crashing--see its entry on this page) it won't bring down the entire session. However, anyone who tries to run Chrome alongside any other program besides maybe Windows Explorer without a near-bottomless well of RAM and gets the "tab/Google Chrome has crashed" error messages on a daily basis is probably questioning whether this design decision was worth the trade-off, or even whether it achieves its intended purpose.
* For some reason, if you attempt to save a picture from Website/{{Twitter}} to your computer from Chrome, it will be assigned the nonexistent file type ".jpg-large". The kicker is that if you manually insert the .jpg, the picture will save just fine. It's even worse on Android Chrome, where you can't save pictures from Twitter at all.
** This appears to be caused by the way Twitter handles pictures. When you click an image in your twitter feed, it appends ":large" to the URL ([[https://pbs.twimg.com/media/C92VACaV0AAZG7V.jpg:large example]]). Some browsers have no idea how to handle this (Chrome will try to name the example [="C92VACaV0AAZG7V.jpg-large"=], while Firefox will try to name it [="C92VACaV0AAZG7V.jpg large.jpg%20large"=]). Other sites that use URL postfixes can have similar problems.
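Handling the suffix correctly would be trivial -- a hypothetical sketch (the function name is made up) of what a download routine could do with such a URL:

```python
# Sketch of sane filename derivation for Twitter's ":large" image URLs:
# take the last path segment, then drop any ":size" suffix instead of
# gluing it onto the extension.
from urllib.parse import urlparse

def download_name(url):
    """Derive a download filename from an image URL, dropping a ':size' suffix."""
    path = urlparse(url).path        # "/media/C92VACaV0AAZG7V.jpg:large"
    name = path.rsplit("/", 1)[-1]   # "C92VACaV0AAZG7V.jpg:large"
    return name.split(":", 1)[0]     # "C92VACaV0AAZG7V.jpg"

print(download_name("https://pbs.twimg.com/media/C92VACaV0AAZG7V.jpg:large"))
# C92VACaV0AAZG7V.jpg
```

A couple of string operations, which is what makes the ".jpg-large" behavior such a good fit for this page.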
* So what do you do when you've had enough and want answers from Google Customer Service? Call them? I hope you weren't planning to do that over Google Voice, because the application is incompatible with the Google Customer Service line.
* Android: if you for some reason have to reset your device, you have to use your Google account to unlock it. This would be fine, if not for the fact that it forces you to use your old password and doesn't accept a more recent one.
* Android's power-management system makes it highly vulnerable to rogue applications. To explain it easily: by itself the system always tries to go into deep sleep, where everything is as powered-down as it can be to save battery life -- unless an app tells the system "hey, please wake up, I'd like some CPU power to do stuff" using something called a wakelock. Well-made apps make sparing use of wakelocks -- for instance, to check messages every few seconds by very briefly waking up the phone. But a badly made app can request wakelocks continuously, keeping your phone awake and drawing battery power uselessly. There is ''no easy way to prevent this'' in current versions of Android. An experienced user can install all sorts of monitoring applications and try to hunt down and delete (or freeze temporarily) the accursed rogue app, but it's often a difficult and frustrating task, made worse by the fact that applications that used to behave fine can break during an upgrade and become rogue. Importantly, the hunt-down-the-power-sucker rigmarole is not a basic-level operation, and a lot of average users don't even know it exists -- leading many to complain about the battery life of handsets that really should be lasting a lot longer.
** Often, the rogue app is ''an integral part of Android''. The backup feature, for instance, likes to get stuck and stay awake trying to sort itself out -- for eternity, if not disabled manually. Ancillary Android apps show up as a bundled-together "Android System" entry in the stock power manager (making it impossible to know ''what'' part of the system is messed up without, again, third-party monitors); a quick Google search for "Android System battery drain" will show rather effectively how often Android pole-axes its own power management.

[[folder:Hewlett-Packard: Inventing New Headaches]]
Why does ink have to be more [[CrackIsCheaper expensive]] than fine wine to realize a profit margin?
* Famously, the "PC LOAD LETTER" message you'd get on early HP Laserjets has been elevated as an example of confusion in user interfaces. Anyone without prior knowledge would assume something is wrong with the connection to the PC, or something is off in the transfer of data ("load" being interpreted as "upload"), and that the printer is refusing the "letter" they're trying to print. What it actually means is "load letter-sized paper into paper cassette"; why the printer wasn't simply programmed to say "OUT OF PAPER" is a RiddleForTheAges.
* HP's printers, scanners, and fax machines come with software to go with the hardware. All of that software except the printer's drivers is optional, but HP does everything short of flat-out lying to suggest that the other crap is required. If you do install the optional software, the stupid thing will pop up an "Error: Your HP printer is disconnected" message every time you turn off the printer.
* Some HP computers come with batteries or power supply units that are known to ''[[MadeOfExplodium explode]]''. Literally, with no exaggeration, [[https://www.youtube.com/watch?v=nHPf6jHdEFg they release sparks and smoke]] (and [[http://h30499.www3.hp.com/t5/Business-PCs-Deskpro-EVO/Power-supply-D530-SFF-explodes/td-p/1148930#.UAYwpfWDmSp this is a "known issue"]]). Others [[http://www.pcworld.com/article/248713/hp_recalls_laptops_over_faulty_batteries.html overheat]] [[http://www.cpsc.gov/cpscpub/prerel/prhtml09/09221.html and burst]] [[http://consumerist.com/2009/05/hp-adapter-catches-fire-burns-pants-execs-ignore.html into flames]]. And there have been ''multiple'' [[http://consumerist.com/2010/05/54000-more-hp-batteries-recalled.html recalls]], proving that they obviously didn't learn from the first one.
* In UsefulNotes/TheNewTens, HP and several other printer companies started using DRM in their printers, of all devices. These were meant to block refilled cartridges in an attempt to exploit printer ink's ridiculously high markup rates. However, they're notoriously inaccurate -- which means they randomly reject brand-new cartridges fresh out of the box for no apparent reason.
** This is because part of the system is a microprocessor on the ink tanks that will completely brick the cartridge after a certain date. That's right -- ink cartridges now ship with an ''expiration date'' that the retailer sees on the shipment box, but since they're not "perishables" or food items, they're not required to display this information on their packaging.
* The Hewlett-Packard support assistant that comes with many of their computers, such as the Pavilion, is very poor at automating updates of included software such as Cyberlink's utilities and HP's own service software. Batch updating often results in several updates failing to install, requiring the user to manually download the failed updates themselves.
* In an effort to better suit beginning users, Hewlett-Packard includes bundles of software on their pre-configured computers (including the infamous Norton Security software) that add functionality like CD and DVD authoring tools, some [[{{Shareware}} Wild Tangent game trials]], and other stuff. Unfortunately, if you're a power user who wants to install only your preferred software, this software just gets in the way, and can potentially interfere in the smooth operation of Windows. Worse, if you took Microsoft's free offer to upgrade to Windows 10, the upgrade may not go as expected, leaving a barely usable system with slow-downs and even fatal crashes. Fortunately, if your system is a disaster, you can use a software tool (such as "Produkey") to copy down your installation key, download the Windows Media Creation Tool to make a bootable installation DVD, and install a clean copy of Windows 10 without the extra bloat on the hard drive and processor.
* Some of HP's printers come with a feature called HP Smart Install. Ostensibly, this is supposed to make the printer quick and easy to set up by using an autorun program to install its drivers. However, most computers will simply detect the printer as a CD drive when plugged in, and even if the setup program is run manually, it will fail halfway through. In fact, going into the printer's menu (if it has a physical control panel) and ''disabling'' Smart Install is the better option, as the computer will then search for the drivers online and install them itself, with the added bonus of not installing the extra, unnecessary software included in the Smart Install package.
* Some HP laptops have a feature called "Action Keys". On most laptops, doing things like adjusting volume or screen brightness would require you to hold the "fn" key and press a certain function key at the same time. Since the "fn" key is normally at the bottom of the keyboard and the function keys are normally at the top, this could get annoying or [[SomeDexterityRequired uncomfortable]], so HP had the idea to switch this around: pressing a function key on its own would adjust volume or brightness, and users who actually want to press the function key can do so by holding down the "fn" key. Since most computer users adjust the volume or brightness much more often than they need to press F6, this is nice, but what if you do need to use the function keys a lot (for example, when programming and using debug mode)? Hopefully HP included an option to conveniently switch Action Keys on and off, right? You wish. Disabling or re-enabling the Action Keys requires you to turn off the laptop, then switch it back on while holding F10 to open up the BIOS menu, where you can find the option. Each time you want to switch between these modes, you need to reset your laptop, which is a massive inconvenience. And even worse, screen-reading software does not work in the BIOS menu, which makes changing these options almost impossible for visually impaired users. On the bright side, key combinations like Alt+F4 still work, but why HP didn't just make this easy to switch on and off is a mystery.
[[/folder]]

[[folder:Intel - Suffer What's Inside]]
Being a leader in processor development doesn't mean everything worked out the first time.
* The "Prescott" core Pentium 4 has a reputation for being pretty much the worst CPU design in history. It had some design trade-offs which lessened the processor's performance-per-clock compared to the original Pentium 4 design, but theoretically allowed the Prescott to run at much higher clockspeeds. Unfortunately, these changes also made the Prescott vastly ''hotter'' than the original design, making it impossible for Intel to actually achieve the clockspeeds they wanted. Moreover, they totally bottlenecked the processor's performance, meaning that Intel's usual performance-increasing tricks (more cache and faster system buses) did nothing to help. By the time Intel came up with a new processor that put them back in the lead, the once hugely valuable "Pentium" brand had been rendered utterly worthless by the whole Prescott fiasco, and the new processor was instead called the Core 2. The Pentium name is still in use, but is [[DemotedToExtra applied to the mid-range processors that Intel puts out for cheap-ish computers]], somewhere in between the low-end Celerons and the high-end Core line.
** While the Prescott iteration of the design had some very special problems of its own, the Pentium 4 architecture in general had a rather unenviable reputation for underperforming. The design was heavily optimised in favour of being able to clock to high speeds in an attempt to win the "megahertz war", on the grounds that consumers at the time believed that higher clock speed = higher performance. The sacrifices made in the P4 architecture in order to achieve those high clock speeds, however, resulted in very poor performance per tick of the processor clock. For example, the processor had a very long instruction pipeline[[note]]a pipeline breaks the fetch/decode/execute cycle into subtasks that can be executed assembly-line style, so that while one instruction is being sent to the ALU for execution, the next in the sequence is being decoded and the one after that is being fetched, allowing close to one instruction per clock cycle instead of one instruction taking several cycles[[/note]]. This was fine as long as the program being executed didn't do anything unexpected like jump to a new instruction; if it did, every instruction in the pipeline had to be discarded, stalling the processor until the new execution path was loaded into the pipeline. And because the pipeline was a lot deeper than the Pentium III's, the processor would stall for many more clock cycles while the pipeline was purged and refilled. Branch prediction at the time wasn't good enough to avoid this reliably, so pipeline stalls were a common occurrence on Pentium 4 processors.
This, combined with other bone-headed design decisions like the omission of a barrel shifter[[note]]a device that can shift a value by an arbitrary number of places in a single clock cycle, as opposed to shifting one step per clock cycle until the desired result is achieved[[/note]] and providing multiple execution units of which only one could usually execute per clock cycle, meant that the contemporary Athlon processor from AMD could eat the P4 alive at the same clock speed thanks to a far more efficient design. (The last problem was partially solved with "hyperthreading", which presents a single-core processor to the OS as a two-core processor and uses some clever trickery in the chip itself to let the otherwise-idle execution units execute a second instruction in parallel, provided it meets certain criteria.)
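Just how much a deep pipeline amplifies the cost of mispredicted branches can be sketched with a toy model (the numbers below are purely illustrative, not actual Pentium III or Pentium 4 figures):

```python
# Toy pipeline model: every instruction normally retires in one cycle, but
# a mispredicted branch flushes the pipeline, wasting roughly one cycle
# per pipeline stage. All figures here are made up for illustration.

def total_cycles(n_instructions, pipeline_depth, branch_rate, mispredict_rate):
    branches = n_instructions * branch_rate
    flushes = branches * mispredict_rate
    return n_instructions + flushes * pipeline_depth

# The same workload on a shallow (PIII-like) and a deep (P4-like) pipeline,
# with identical branch-prediction accuracy.
shallow = total_cycles(1_000_000, pipeline_depth=10, branch_rate=0.2, mispredict_rate=0.05)
deep = total_cycles(1_000_000, pipeline_depth=31, branch_rate=0.2, mispredict_rate=0.05)
print(shallow, deep)  # 1100000.0 1310000.0
```

With the same prediction accuracy, the deeper pipeline pays roughly three times the stall penalty on the same code -- which is why the P4 needed much better branch prediction than it actually had.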
* The Prescott probably deserves the title of worst ''x86'' CPU design ever (although there might be a case for the 80286), but allow us to introduce you to Intel's ''other'' CPU project of the same era: the Itanium. Designed for servers, using a bunch of incredibly cutting-edge hardware design ideas. Promised to be incredibly fast. The catch? It could only hit that theoretical speed promise if the compiler generated ''perfectly optimized'' machine code for it. Turned out you ''couldn't'' optimize most of the code that runs on servers that hard, because programming languages suck [[note]]more precisely, the Itanium relied on the compiler to schedule instructions for parallel execution ahead of time, but typical server code is full of branches and memory accesses whose behaviour can't be predicted at compile time[[/note]], and even if you could, the compilers of the time weren't up to it. Turned out if you ''didn't'' give the thing perfectly optimized code, it ran about half as fast as the Pentium 4 and sucked down twice as much electricity doing it. Did we mention this was right about the time server farm operators started getting serious about cutting their electricity and HVAC bills?
** Making things worse, this was actually Intel's ''third'' attempt at implementing such a design. The failure of their first effort, the [=iAPX-432=], was somewhat forgivable, given that it wasn't really possible to achieve what Intel wanted on the manufacturing processes available in the early eighties. What really should have taught them the folly of their ways came later in the decade with the [=i860=], a much better implementation of what they had tried to achieve with the [=iAPX-432=]... which still happened to be both slower and vastly more expensive than not only the 80386 (bear in mind Intel released the 80'''4'''86 a few months before the [=i860=]) but also the [=i960=], a much simpler and cheaper design which subsequently became the EnsembleDarkhorse of Intel, and is still used today in certain roles.
** In the relatively few situations where they get the chance to shine, the Itanium 2 and its successors can achieve some truly awesome performance figures. The first Itanium, on the other hand, was an absolute joke. Even if you managed to get all your codepaths and data flows absolutely optimal, the chip would only perform as well as a similarly clocked Pentium '''III'''. Intel actually went so far as to recommend that only software developers should even think about buying systems based on the first Itanium, and that everyone else should wait for the Itanium 2 -- which probably ranks as one of the most humiliating moments in the company's history.
*** The failure of the first Itanium was largely down to the horrible cache system that Intel designed for it. While the L1 and L2 caches were both reasonably fast (though the L2 cache was a little on the small side), the L3 cache used the same off-chip cache system designed three years previously for the original Pentium II Xeon. By the time the Itanium hit the streets, however, running external cache chips at CPU speeds just wasn't possible anymore without some compromise, so Intel decided to give them extremely high latency. This proved to be an absolutely disastrous design choice, and basically negated the benefits of the cache. Moreover, Itanium instructions are four times larger than [=x86=] ones, leaving the chip strangled between its useless L3 cache and L1 and L2 caches that weren't big or fast enough to compensate. Most of the improvement in the Itanium 2 came from Intel simply making the L1 and L2 caches similar sizes but much faster, and incorporating the L3 cache into the CPU die.
* Intel's Atom wasn't much better. The first generation (Silverthorne and Diamondville) was even slower than a Pentium III: the power consumption was low, but the CPU performance was awful. To make things worse, it officially supported only Windows XP, and even that lagged. The following generations prior to Bay Trail were mere attempts at being competitive, but sadly they were still slower than VIA's processors -- which were themselves considered slow chips in their day.
** Averted wonderfully with Bay Trail, which managed to beat VIA's chips and pull even with AMD's low-tier chips (which were having a very rough time with Kabini and Temash) while offering decent GPU performance. Sadly, the next iteration, Cherry Trail/Braswell, got hit in a brutal way: the Atom team had the 'best' idea for improving GPU performance (to reach AMD Beema levels)... sacrificing CPU performance. When they realized that doing so ended up hurting both, they tried to raise Turbo speeds, but within the constrained power budget that only made things worse, making this generation a massive regression from Bay Trail.
** And then there is Intel [=SoFIA=]. Intel had found some success in the mobile market with Moorefield (Bay Trail chips with a [=PowerVR=] GPU), and people expected them to continue down that path, but Intel instead decided to pair cut-down x86 cores with a Mali GPU in order to reduce costs. Sadly, the initial chips were at best as fast as the ''slowest'' ARM design then available (the Cortex-A7) -- and often slower. Worse, those slowest ARM chips were already being replaced by the A53/A35 as the new low tier, leaving Intel far behind. No wonder they cancelled [=SoFIA=].
* While Intel's CPU designers have mostly been able to avoid any crippling hardware-level bugs since the infamous Pentium FDIV bug of 1994 (say what you will about the Pentium 4; at least it could divide numbers correctly), their chipset designers seem much more prone to screw-ups:
** Firstly, there was the optional Memory Translator Hub (MTH) component of the 820 chipset, which was supposed to allow the use of more reasonably priced SDRAM instead of the uber-expensive RDRAM that the baseline 820 was only compatible with. Unfortunately, the MTH basically didn't work at all in this role (causing abysmally poor performance and system instability) and was rapidly discontinued, eventually forcing Intel to create the completely new 815 chipset to provide a more reasonable alternative for consumers.
** Then there were the 915 and 925 chipsets; both had serious design flaws in their first production run, which required a respin to correct, and ultimately forced Intel to ditch the versions they had planned with wi-fi chips integrated into the chipset itself.
** The [=P67=] and [=H67=] chipsets were found to have a design error that supplied too much power to the SATA [=3Gbps=] controllers, which would cause them to burn out over time (though the [=6Gbps=] controllers were unaffected, oddly enough).
** The high-end [=X79=] chipset was planned to have a ton of storage features, such as up to a dozen Serial SCSI ports along with a dedicated secondary DMI link for storage functions... only for it to turn out that none of these features actually worked, meaning that it ended up being released with ''fewer'' features than its consumer counterparts.
** A less severe problem afflicts the initial runs of the [=Z87=] and [=H87=] chipsets, in which USB 3.0 devices can fail to wake up when the system comes out of standby, and have to be physically disconnected and reconnected for the system to pick them up again.
** Speaking of the 820 chipset, anyone remember [[https://en.wikipedia.org/wiki/RDRAM RDRAM]]? It was touted by Intel and Rambus as high-performance RAM for the Pentium III, to be used in conjunction with the 820. But the implementation was not up to snuff (in fact, benchmarks revealed that applications ran slower with [=RDRAM=] than with the older [=SDRAM=]!), not to mention very expensive, and third-party chipset makers (such as [=SiS=], who gained some fame during this era) went with cheaper DDR RAM instead (and begrudgingly, so did Intel, leaving Rambus with egg on its face), which ultimately became the de facto industry standard. [=RDRAM=] still found use in other applications, though (like the Nintendo 64 and the [=PlayStation=] 2).
*** A small explanation of what happened: Rambus's [=RDRAM=] is more serial in nature than more traditional memory like [=SDRAM=] (which is parallel). The idea was that [=RDRAM=] could use a high clock rate to compensate for its narrow bit width ([=RDRAM=] also used a neat innovation, double data rate, using both halves of the clock signal to send data; however, two could play at that game, and [=DDR SDRAM=] soon followed). But there were two problems. First, all this conversion required additional complex (and ''patented'') hardware, which raised the cost. Second, and more critically, this kind of electrical maneuvering involves conversions and so on, which adds latency... and memory is one of the areas where latency is a key metric: the lower, the better. [=SDRAM=], for all its faults, operated more on a KeepItSimpleStupid principle, and it worked; later versions of the technology introduced necessary complexities at a gradual pace (such as the [=DDR2=]/[=DDR3=] preference for matched pairs/trios of modules), making them more tolerable.
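The latency-versus-bandwidth trade-off at the heart of the [=RDRAM=] story can be sketched numerically (all figures below are hypothetical, chosen only to show the shape of the problem, not real [=RDRAM=]/[=SDRAM=] timings):

```python
# Time to service one memory request: a fixed latency cost up front, then
# the transfer itself at the interface's peak bandwidth. All numbers are
# hypothetical and chosen purely for illustration.

def access_ns(latency_ns, bytes_per_ns, request_bytes):
    return latency_ns + request_bytes / bytes_per_ns

# "Narrow but fast, high latency" (RDRAM-like) vs
# "wide but slower, low latency" (SDRAM-like).
for size in (64, 4096):  # one cache line vs a long streaming burst
    narrow_fast = access_ns(latency_ns=90, bytes_per_ns=1.6, request_bytes=size)
    wide_slow = access_ns(latency_ns=40, bytes_per_ns=1.0, request_bytes=size)
    print(size, narrow_fast, wide_slow)
```

For cache-line-sized requests -- i.e. what a desktop CPU mostly issues -- the low-latency design wins despite its lower peak bandwidth; the high-bandwidth design only pulls ahead on long streaming bursts, which is one reason [=RDRAM=] fared better in consoles than on the desktop.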
* Intel's [=SSDs=] have a particular failure mode that rubs people the wrong way: after a certain amount of writes, the drive goes into read-only mode. This is great and all until you consider that the write limit is often lower than that of other [=SSDs=] using similar technology (up to about 500 terabytes versus 5 petabytes), and that you only have '''one chance''' to read the data off the drive: if you reboot, the drive goes into an unusable state, regardless of whether the data on it is still good.
[[/folder]]

[[folder:Microsoft Doesn't Take You Where You Want to Go Today]]
With great market share comes great exposure to malicious coders. Keep those virus definitions updated, and Windows Updates downloading.
* Microsoft in general tends to have a lot of problems with [[https://en.wikipedia.org/wiki/Not_Invented_Here Not Invented Here]] and an inability to let go of problematic legacy code and designs (often because [[http://blogs.msdn.com/b/oldnewthing/archive/2003/12/24/45779.aspx lots of third-party software relies on the problematic behavior]]) in their own unsung examples of Idiot Programming.
* In fact, similar to The Daily WTF, Microsoft veteran Raymond Chen's blog [[http://blogs.msdn.com/b/oldnewthing/ The Old New Thing]] is a good place to look for explanations of things that at first seem to be Idiot Programming on Microsoft's part. As Chen puts it, ''[[http://blogs.msdn.com/b/oldnewthing/archive/2010/08/16/10050336.aspx "no matter what you do, somebody will call you an idiot"]]''.
** As of October 2010, [[http://www.testtrack4.com/cracked/appcompatlist.txt this is a list]] of programs that can be described as "we found this program doing stupid shit and have to work around it". One of the reasons Vista was so poorly received was because a lot of programs that did stuff they shouldn't have done wouldn't work properly. "Properly" being based on guidelines formed around 2001 and enforced in 2007.
* Mac Word 6. So legendarily bad, the Windows version ran faster ''in an emulator''. Bear in mind that this was back when [=PC=]s ran x86 code and Macs were on the '''68k''' architecture. Exacerbated by the quality of its immediate predecessor, Mac Word 5.1, often regarded even today as Microsoft's finest work and possibly the best word processor ever written. The troubles of Word 6 were mainly due to an attempt to make it universally coded (i.e., both Mac and PC-friendly), but the result was a slow, memory-intensive clunker.
* Versions of Word as recent as 2003 had a document filesize limit of 32 megabytes -- a limit easily reached by a document with 30 reasonably sized photos embedded in it.
* Despite its beautiful and fairly responsive user interface, Windows Live Mail has several glaring flaws, such as pegging the CPU for a full minute to fetch mail from an IMAP server and popping an error message ''every time'' it is disconnected, even though it should be a ForegoneConclusion if you leave a dormant connection lying open for several minutes with no activity.
* Every version of UsefulNotes/MicrosoftWindows gets this when it first comes out (except, strangely, for Windows 7), but Windows Vista and Windows ME got the most backlash. The common belief now is that most of Windows Vista's bad reception came from a sub-optimal initial release, which had a number of serious bugs relating to file transfers and networking (they mostly caused speed problems rather than data corruption, but they made using pre-[=SP1=] versions of Vista a pain in the backside). Most of the serious problems were fixed with the first service pack, but Vista's reputation, which had already been dented by its failure to live up to Microsoft's early promises, never really recovered. Even after applying service packs, Vista is still very noticeably dog-slow compared with Windows 7/8/10, especially on budget Pentium D systems and comparable AMD setups.
** Windows ME, on the other hand, was arguably the worst operating system (apart from the infamously broken MS-DOS 4.00) ever released by Microsoft, to the extent that geeks have been wondering for years whether it was some kind of SpringtimeForHitler plot to make the upcoming NT-based "Whistler" (what would subsequently become Windows XP) look better. Perhaps the biggest problem was that the memory management system was so catastrophically broken that up to a '''third''' of any given system's RAM was often rendered unusable due to memory leaks. Moreover, System Restore (which would become a well-loved feature in XP and beyond) severely hurt performance, not helped by the aforementioned memory management problem, and would quite often fail to restore your documents and important files, but ''did'' restore viruses and malware.
*** A particularly facepalm-worthy bug: ME, for the first time, supported Zip files without an external program. This was back in an era when diskettes were still ubiquitous, so a Zip file spanning multiple diskettes was not a particularly uncommon sight. Opening a spanned archive would result in a prompt for the first diskette... and it would keep asking until you produced that diskette. If it was lost, reformatted or damaged, you were out of luck, because there was no way to cancel out of that dialog box and no way to terminate it without terminating Explorer.
*** Another stunning Windows ME design decision that seemed brilliant on paper but turned out to be terrible in execution: originally in Windows 3.x and 9x, if a program was stalling, you could press CTRL+ALT+DEL to interrupt it, and pressing the combination again would force a reboot. In Windows ME, CTRL+ALT+DEL instead became the key combination to open the Task Manager, but pressing it again would still force a reboot. However, since this is ''Windows ME'' we're talking about, there was absolutely no way to tell if the computer was majorly lagging or simply hadn't registered the input at all. And, of course, if the system '''is''' lagging, it still keeps track of which buttons you've pressed. So, naturally, you push CTRL+ALT+DEL to open the Task Manager just to shut down a program that isn't working right, and the Task Manager doesn't appear. Wait thirty seconds. Still nothing, so you push CTRL+ALT+DEL again. After another ten seconds, it finally pops up. You go to close the program, and then... "Windows is shutting down."
*** ME's biggest problem was that it supported two driver types: the new, NT-style [=DLLs=] we're all comfortable with today, and the old [=VxDs=] that originated way back in the days of Windows 2.1. The idea was to phase out the old drivers while maintaining backward compatibility; alas, what actually happened is that the two driver types didn't play nice with each other. As a result, if you needed a combination of the two driver types for your computer (which almost everybody did) ME crashed often and with great enthusiasm. However, if you were one of the lucky few whose hardware only needed one driver type, [=WinME=] actually worked kind of decently -- hence the occasional user who can't quite figure out why everybody hated ME when it worked so well for them.
** Windows XP was pretty decent in most aspects when it was released...except for the OS's security, which was broken beyond belief, even if it wasn't obvious at the time of release. This owed to an attempt to make it backwards-compatible with the past four [=OSes=]. Famously, it was demonstrated that if you installed the RTM build of XP on a computer in mid-2007 and browsed the internet for just an hour, the OS would be hopelessly corrupted by viruses and malware, to the point of requiring a complete reformat and reinstall of the system.
*** This was exacerbated by the decision to give the main user account administrative privileges, opening up many ways to potentially ruin the operating system. This helped the operating system behave more like Windows 98 for compatibility's sake, but it was a very risky choice that left an uncontested back door open to attacks and required patch after patch to remedy.
** Vista was particularly hilarious in the way it restructured so many things that Microsoft actually had to set up workshops to teach people how to use it; customers found these workshops very helpful. Snarkers were quick to pick up on the fact that Vista was perfectly intuitive, provided you had a trained expert holding your hand every step of the way.
** A major annoyance for new Vista[=/=]7 users who migrate from XP: the Read Only bug. Any hard disk with an NTFS file system that was created in XP and gets imported into a Vista[=/=]7 system will by default have all files and folders stored on it set as read-only, even for a user with administrative privileges, and even if one uses the "take ownership" feature. The solution is to go to the Security properties of each and every file and folder (the fastest way is to go to the file system's root directory, select all files, and apply the following steps to all child files and folders), add the current user account to the list, declare it the owner, and grant it all privileges.
** Windows Update may sometimes screw up the bootloader on the hard drive, necessitating a reinstall of sorts to fix it -- particularly annoying if the Windows installer didn't set up the recovery environment, and thus a way to ''fix'' the computer. The occasional Windows update also seems to disregard the drive letter of the Windows installation it's being installed on, instead just selecting the earliest drive letter with a Windows installation on it, or some other arbitrary choice. This is evidenced by trying the update, having it fail, turning off the computer, physically disconnecting the other drive, trying again -- and watching the update succeed.
** Users that updated to Windows 8.1 via the Microsoft Store noticed one crucial feature about Windows 8 that was really nice to have: Refresh. This way you can effectively "reinstall" Windows without having an install disk. The problem? It actually relied on a special file (presumably an ISO copy of the install disk) and the 8.1 update failed to include this, making it impossible to use the Refresh command. Whoops. Though some people have figured out a way around this.
** Windows Vista introduced the [=ProgramData=][[note]]formerly '''C:\Documents and Settings\All Users''' and '''C:\Documents and Settings\All Users\Application Data''', which now redirect to it[[/note]] folder and encouraged application developers to store application data there; for many applications, adding content was as simple as dragging and dropping new files into that folder for the program to read. Cue Windows 8.1 making the directory writable only by admins...
** Windows Update on many machines (from Vista through 8.1) may sometimes install a driver update for Ethernet and Wi-Fi adapters. This would not be significantly problematic, save that the drivers installed are produced by ''Microsoft'' instead of Atheros or whoever actually made the hardware, and that after installation, the affected network connections no longer properly perform DHCP. DHCP, the Dynamic Host Configuration Protocol, is what lets a computer obtain an IP address on its local network, learn its default route, and find the DNS servers it needs for name resolution -- meaning that said computers can no longer turn 'google.com' into an address they can actually connect to. Have fun replacing your drivers without a System Restore!
** A notorious problem is that Task Manager, the interface primarily used to halt crashed programs, became a standard program itself from Windows Vista onward. This means that it can be impossible to start Task Manager if the crashed program is "spin locking" (constantly using the CPU without achieving anything), because Task Manager has no special claim on the CPU compared to the crashed program. Task Manager also tries to place itself on top of other windows, but again has no special claim on this, so it can be forced to the back by full-screen or aggressive programs -- an exploit frequently used by malware. Ironic, because in Windows 3.1 Task Manager took priority over absolutely everything, although this may have been down to that era's much simpler cooperative multitasking architecture.
*** Likewise, Windows Explorer (the software that provides the desktop interface) is considered just another program, not part of the operating system. This means that even fundamental tasks like starting programs and moving and copying files can be disrupted by other programs.
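To make "spin locking" concrete: a busy-wait loop keeps a core 100% occupied doing nothing, whereas a proper blocking wait lets the scheduler hand that core to something that can actually make progress. A minimal sketch (hypothetical illustrative code, not how any Windows component is actually written):

```python
import threading

flag = threading.Event()

def spinning_wait(flag, max_spins=1_000_000):
    # Busy-wait: hammer the flag in a tight loop, burning a full core the
    # whole time -- the behaviour that starves everything else, Task
    # Manager included. (The spin cap is only here so this demo can't hang.)
    spins = 0
    while not flag.is_set() and spins < max_spins:
        spins += 1
    return spins

def blocking_wait(flag):
    # Cooperative wait: the OS parks the thread until the flag is set,
    # costing essentially no CPU while waiting.
    return flag.wait(timeout=1.0)

flag.set()
print(spinning_wait(flag), blocking_wait(flag))  # 0 True
```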
* The Zune software. The interface is fine, but at first it devoured RAM and took up way too much CPU power for what it did. Each subsequent release managed to improve performance, though, to the point where as of version 4 even machines not much above the minimum spec can run it with all the visual effects turned on with little to no problem. iTunes, on the other hand, started as bloated garbage and only got slower and more bloated over time.
** The Zune player itself, on the other hand, [[http://edition.cnn.com/2009/TECH/01/01/zune.player.failures/ had a leap-year glitch]] which made it freeze up on New Year's Eve of a leap year, thanks to the clock driver [[http://www.zuneboards.com/forums/zune-news/38143-cause-zune-30-leapyear-problem-isolated.html screwing up how it handles leap years]].
** The Zune was also incompatible with [=PlaysForSure=]-protected media. Microsoft apparently can't even maintain compatibility with its own stuff.
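The published analysis of the Zune freeze boiled down to one loop in the clock driver that converts a running day count into a year. Translated into Python and simplified (the real driver was C firmware with no escape hatch; the guard below is our addition so the sketch can't hang):

```python
def is_leap(year):
    return (year % 4 == 0 and year % 100 != 0) or year % 400 == 0

def year_from_days(days, year=1980, guard=100_000):
    # Simplified reconstruction of the Zune 30 clock driver's logic.
    while days > 365:
        guard -= 1
        if guard == 0:
            return None  # loop made no progress -- the real Zune just hung
        if is_leap(year):
            if days > 366:
                days -= 366
                year += 1
            # days == 366 in a leap year: neither branch fires, so the
            # loop spins forever. This is the bug.
        else:
            days -= 365
            year += 1
    return year

print(year_from_days(366, year=2008))  # None -- 31 Dec 2008, the freeze
print(year_from_days(365, year=2008))  # 2008 -- any other day is fine
```

The fix is equally mundane -- treat day 366 of a leap year as December 31st and break out of the loop; Microsoft's actual advice at the time was simpler still: let the battery drain and wait for January 1st.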
* Windows Live Hotmail. Until late 2010, despite both browsers being entirely capable of handling everything the application did, Opera and Chrome had to spoof their identity as something else before Microsoft would allow access. They also had to supply their own scripts, because the ones served by the site itself were broken.
** In 2011, it fell prey to yet another issue: an attempt to fight spam... by preventing it from leaving the user's drafts folder. Perfectly legitimate mail is often blocked, which Microsoft acknowledges, while giving no clue as to how to change your message so it will actually send, beyond "re-edit it so it looks less spam-like". "Spam-like" covers, among other things, content-free subject lines such as "RE: How's it going?".
* Active Desktop was an optional Windows 95 update released in '97 in an effort to catch up with that "World Wide Web" thing that had taken Microsoft by surprise and capitalize on the new push technology hype (basically RSS feeds). The concept was ahead of its time: you could place webpages and things like weather and stock updates right on your desktop and bypass the browser altogether. It also gave your folders a spiffy overhaul, introduced the quick launch bar and made everything clickable look like hyperlinks. In fact, folder windows were browser windows and you could type both [=URLs=] and folder paths into the address bar. There was one problem (aside from the need to be constantly connected over pay-per-minute dialup to receive push updates): many user interface elements were essentially outsourced to your browser, and this was back when a crash in one browser window tended to take down all others with it. The browser was the paragon of instability known as Internet Explorer 4. You can see where this is going.
** Things got more sensible and less crash-prone in Windows 98, but the desktop components remained unstable all the way until Microsoft realized no one used the feature for exactly this reason and replaced it with desktop gadgets in Windows Vista.
* Microsoft Outlook uses one giant .pst blob for all emails, which tends to get corrupted once it reaches two gigabytes. [[http://support.microsoft.com/kb/296088 This page]] acknowledges this, and implies that it's the user's fault for exceeding the size limitation, since users have nothing better to worry about in their lives. Note how the error doesn't come up until ''after'' it's a potential problem, and the fix simply truncates the .pst to 2GB, destroying the messages that don't make the cut.
** Prior to Version 7, Exchange did much the same thing: all mail for its users was stored in a single flat file on the Exchange server. This file was generally created in its entirety in advance and populated over time, rather than constantly expanding. The problems: if the file became "fragmented" as users deleted messages, it would require a compression cycle, which required the server to be taken offline for possibly hours. Additionally, if the file reached its limit, it would simply stop accepting new messages while acting to users like nothing was wrong. The file could be increased in size, but only to about 16GB (as of Exchange 5). The safest solution was to migrate to Version 7, which in and of itself was a nightmare that often required rebuilding the entire system to deal with the absolute requirement of Active Directory.
** Outlook 2007 had a very bizarre problem where merely having the font Helvetica installed on the PC would cause Outlook to crash whenever opening a new e-mail. Yes, really.
** Outlook 2013. Want to see your old messages in the inbox? Well, you can't! You have to click a damn link at the bottom of the screen "to see more". You just changed folders and want to return to the inbox? Guess what: you need to click the damnable link again! There is no option to simply display all the messages in your inbox.
** You've attached a file and opened it from within Outlook to make sure it's correct before sending. Outlook will warn you that you're about to alter the file and ask whether you want to continue. Say no, and it will ask again. And again. It will not stop harassing you.
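That 2 GB ceiling, incidentally, is the classic fingerprint of signed 32-bit file offsets: once a position in the file can't be represented in the field that stores it, everything past that point is unreachable. A minimal sketch of the arithmetic (an illustration of the general failure mode, not Outlook's actual code):

```python
import struct

SIGNED_32_MAX = 2**31 - 1  # last byte a signed 32-bit offset can address: ~2 GB

def pack_offset(offset: int) -> bytes:
    """Pack a file offset the way a format with signed 32-bit
    offset fields (like the ANSI-era .pst) would have to."""
    return struct.pack("<i", offset)

pack_offset(SIGNED_32_MAX)          # the largest representable offset: fine
try:
    pack_offset(SIGNED_32_MAX + 1)  # one byte past 2 GB
except struct.error as err:
    print("offset overflow:", err)
```

Real-world code rarely fails this politely; more often the offset silently wraps negative, which is how a mailbox file ends up corrupted rather than cleanly rejected.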
* Internet Explorer:
** Internet Explorer 6...[[note]](this technically applies to the earlier versions as well, but they were mostly out of use by the time the major issues became apparent)[[/note]] ''hoo boy''! At the time of its release it was widely seen as being a decent enough browser, though a lot of people complained that it hadn't really changed ''that'' much since version 4. What became obvious as the years progressed -- something made all the worse by Microsoft's decision to cease development on Internet Explorer and only upgrade it when what would eventually become Windows Vista was released; while they eventually backtracked on this policy and released it for older versions of Windows, the eventual Internet Explorer 7 wouldn't come until near the end of 2006, ''five years'' after [=IE6=]'s introduction -- was that it suffered from unbelievably poor security, to the point where new exploits were being found literally every day by the hacker community by early 2004. This, combined with Microsoft's stubborn refusal to do anything more than patch the most serious bugs, effectively restarted the Browser Wars which had largely been over since 1998, and saw Microsoft's >90% share of the browser market get demolished by Chrome, Firefox and the other browsers, to the point where only around 20% of PC users use Internet Explorer as their primary browser today (in comparison, over 40% of PC users use Chrome as their primary browser).\\
The transition from [=IE6=] to newer browsers wasn't helped by [=IE6's=] support for HTML standards somehow being even ''worse'' than its security, requiring massive re-coding of websites. In fact, some had to maintain separate [=IE6=]-compatible versions for several years after version 7 was released. On top of all of this, a ''ton'' of corporate intranet applications were designed with [=IE6=] in mind and simply didn't work at all on any other browser, forcing many big companies to stick with the browser (and, by extension, Windows XP) even to this day, thus still posing a massive security risk. There are several more things we could talk about, but the bottom line is, [=IE6=] is often regarded as being not just Microsoft's worst product, but arguably one of the worst tech products of all time.
** Internet Explorer 7 and 8 both talked up a "new commitment to standards and performance", with each one certifiably supporting more features than its predecessor, but each paling in comparison to every other browser available when released. [=IE7=] did fix some of the most severe bugs that [=IE6=] had suffered from, but the underlying engine was near-identical with most of the new features being cosmetic, and for the most part the browser was just as insecure and bug-ridden as its predecessor. [=IE8=] by comparison had a redesigned engine that fixed most of the security problems, but added a new problem in that it kinda sucked at rendering older websites. Microsoft tried to divert attention from this by hyping up its "Web Slices" and "Accelerators", both of which were features that only Internet Explorer supported, but all the other browsers could feel free to implement themselves! While this trick worked for Netscape during the first Browser War (until Microsoft ended it by fiat by bundling [=IE4=] with Windows 98), it didn't take this time around, and both versions languished in obscurity, hemorrhaging market share all the while.
** Internet Explorer 9, however, suggested that Microsoft had finally learned its lesson, with development focusing exclusively on [=W3C=]-standardized features (as in [=HTML5=], [=CSS3=], and [=EcmaScript=] 5) -- many that every other browser already supported, and some that they didn't, but only ones that were part of the World Wide Web Consortium-approved standards. Of the tests Microsoft submitted that [=IE9=] passed but other browsers failed, most were passed by at least one other browser, with some submitted tests [[http://samples.msdn.microsoft.com/ietestcenter/html5/MediaElements_harness.htm?url=media-rules-playback.3 not even passing in Internet Explorer 9, but passing in Opera or Safari]].
*** Many legacy features are finally being re-architected to match the reality of modern Web browsing, such as [[http://blogs.msdn.com/b/ie/archive/2010/08/04/html5-modernized-fourth-ie9-platform-preview-available-for-developers.aspx JavaScript being implemented as part of the browser]] instead of going through the generic Windows COM script interface that was introduced over a decade ago and used through [=IE8=].
*** To top it off, the [=IE9=] Platform Previews ran completely platform-agnostic examples an order of magnitude ''faster'' than every other browser out there, by implementing hardware-accelerated rendering through Windows' new [=Direct2D=] API.
** From a plugin standpoint, [[http://blogs.msdn.com/b/ie/archive/2010/08/03/add-on-performance-part-1-measuring-add-on-performance.aspx get a load of these load times]]. The AVG toolbar adds, on average, a full second to the load time ''every time you create a new tab''. On top of lots of other boneheaded on-load hooks, one developer actually incorporated ''network calls'' to their addon's initialization routine, meaning that, until it received a response from a ''remote server'', it would ''block your tab from opening''.
** Not that the later, stabler versions of Internet Explorer don't have their share of irritating issues. IE has the ability to reopen your last browsing session after a crash. Seems fair enough... until you realize that this will not only reopen the tabs and windows you had open but anywhere from five to ''over thirty'' additional windows open to your home page, for no apparent reason.
*** It usually keeps track of any browser window that crashes, and offers to reopen all of them the next time you're prompted. Normally, this isn't a problem, but if you open IE by clicking a link (or double-clicking an internet shortcut on your desktop/start menu) while it's your default browser, you won't be prompted to re-open your last session. If this one crashes, too, it's added to the record. You can see where this is going. Crash 20 browser windows opened that way, then finally launch IE from its icon and choose to restore your last session, and it brings all 20 back. A good practice, taken ''much'' too far.
* Microsoft Office 2007 has some neat features that were previously unavailable, but those features take a backseat to some of the problems it has (though note that most of the known problems were fixed by the time Office 2010 was released):
** The program is a RAM hog, making it cripplingly slow, even on machines with 4GB of RAM.
** The toolbars are nowhere near as customizable as previous versions, leaving you with the "ribbon" at the top, which takes up a good portion of your screen.
** Many of the shortcuts have been eliminated. If you're the kind of person who likes to use a keyboard instead of the mouse, you're out of luck.
** Many of the features have been renamed, and the Help feature doesn't help you with this at all. It would have been nice to go to the help menu, type in the old name of a feature, and be told its new name. Instead, if you had a function that you used in a previous version, you have to figure out for yourself what the new version calls it.
** Many features have been shuffled around, too. Thanks to Microsoft's unusual naming and categorizing strategy, you have to figure out where your features went, which is especially difficult if you aren't sure what the new version calls them, or whether they were removed entirely.
*** For example, in Office 2003 and previous versions, if you wanted to edit the header or footer, you would go to the View menu, then "Header and Footer". In Office 2007, you have to go to "Insert", then "Header" or "Footer".
** Like previous versions, if you do a lot of typing in Word, you're going to spend most of your time looking at the bottom of the screen. The only way to avoid this is to continuously scroll up.
* Microsoft's own spellchecker doesn't recognize Microsoft's own words, such as [=PowerPoint=], unless you capitalize them exactly right.
* Games for Windows Live was Microsoft's attempt to take on UsefulNotes/{{Steam}}, and given Microsoft's sheer resources, many thought they would have some success in this. To put it lightly, however... they didn't. Poor design choices all around meant that it never attracted many users, and was eventually discontinued in August 2013, to be replaced by an integrated app store in Windows 8 -- and for all that OS's faults, the app store sensibly decided to target casual games, a market which Steam and the others haven't exploited as much -- until Windows 10, that is. As for why the GFWL marketplace was discontinued:
** To install a game, you had to have enough room to store ''three entire copies'' of that game on your hard drive[[note]]One for the installation archive, one for the extracted archive, and one for the actual installed copy; it does not delete these things between steps, only at the end[[/note]]. For some games, that's well over 30 gigabytes (and by the end of the store's life, could reach ''100 gigabytes''). Contrast with Steam, which requires enough room to store ''one'' entire copy. You know, the copy that you ''actually use''.
** Also, while Steam will automatically update your games for you so that you never have to worry about not having your game up to date, Games for Windows Live would only tell you it needed to update a game when you actually tried to play it. You know, the exact moment when you don't want to wait several minutes for your game to be ready.
*** Not to mention that it's a crapshoot over whether you can actually get that update -- DLC can be denied to a player for no reason whatsoever. ''VideoGame/RedFaction: Guerrilla'''s multiplayer mode was rendered entirely unplayable until the developers, like many others, updated the game years after the fact to integrate it with (who else) Steam instead, because GFWL basically decided that faking an update and then slowing the game to a crawl rather than finding whatever it was looking for sounded like jolly good fun.
** Another terrible aspect of GFWL is that a lot of games using it have their save games locked down in a way that makes you essentially lose them every time you reinstall a game or try to transfer your progress to another system. Again this compares unfavorably to Steam, which either just keeps out of the way of screwing with save-files in the first place, or, with Steam Cloud, outright embraces transferring them to different systems or installations.
** Also, if you somehow registered for the wrong country, there was no way to go back and change it. At all, not even with customer service. The only solution was to create another account with a different name (and lose everything in the previous one, of course).
** It's so bad that Microsoft ''blacklisted its own program'' in Windows 8.1. If you attempt to install GFWL, it will ask "This program has compatibility issues, do you want to run it?"
* Unfortunately for gamers, once Microsoft attempted to support hardcore PC gamers again with Windows 10, said gamers discovered that the Microsoft Store is an absolute ''disaster''. No offline play, issues downloading games, games requiring patches before they'll run at all (even in single player), and a patching process that essentially requires enough free hard drive space to ''download the entire game again'', among other issues. Some gamers claim that these issues make GFWL look ''competent by comparison'', with many considering the Store a second incarnation of GFWL. No wonder PC games such as ''VideoGame/QuantumBreak'' and ''VideoGame/RiseOfTheTombRaider'' have sold terribly on the Microsoft Store, in the realm of ''single-digit percentages'' of total sales, with the rest being the Steam versions.
** Then there's the UWP format, which is essentially an attempt to replace the standard [=Win32=] executable. Not only is it severely locked down and limited as far as PC gamers are concerned, the way its DRM works clearly wasn't thought through either -- ''Forza Horizon 3'', for instance, has a DRM system that encrypts and decrypts sections of the game data in real-time as that data is needed. Problem is, this is a massive performance hog, and it doesn't mesh well with an open-world driving game that loads sections of its world on the fly, resulting in massive performance drops.
* Versions of Windows Media Player randomly crash with a cryptic message about "server execution" failing. Even when you're trying to play a simple .wav file on your computer with no DRM and network sharing disabled, and there's no conceivable reason it would even need to access a remote server in the first place.
** Also, Windows Media Player started out as a simple, lean-and-clean little program that did its basic job (playing multimedia files) decently well. Then each new version added more and more eye candy, increasing its resource consumption and decreasing its usability, to the point where it became almost unusable. This is actually the reason Media Player Classic was made: to give people the good old WMP without the cruft. The Windows 7 version of WMP backed off the eye candy and cruft a bit and is at least barely usable.
* Absolutely any Windows installer that ignores the default install directory in the registry and tries to put the app in C:\Program Files regardless belongs on this page, but especially if:
** The documentation advises against putting the app in Program Files (because, say, it tries to save its settings in its own directory, multiple users be damned).
** The installer tries to put the app in Program Files even when you tell it to put it somewhere else.
** The installer refuses to work at all if you don't have a C: drive.
** The program is 32-bit, and puts itself in the Program Files folder instead of Program Files (x86) on a 64-bit computer.
** Perhaps most infuriatingly, telling the program to install somewhere else will cause it to install some of itself in the specified location, and the rest in "Program Files" anyway. Bonus points if it only makes a token effort, putting a couple megabytes where you told it and hundreds in the default location.
* Try to create a Windows 8 boot disk and the process may fail unexpectedly. Why? Because the screensaver kicked in after a few minutes of inactivity and ruined the whole thing.
* With the development of Windows 10, Microsoft released an update for older operating systems that does nothing except pester users to get the new OS. This would merely be annoying if it didn't cause the occasional error, slow down boot time, and use lag-inducing amounts of processing power. All for the purpose of informing users of Windows 7 and 8 about an OS that hasn't even been released yet.
** Even after getting rid of the update that caused those annoying pop-ups, [[TheThingThatWouldNotLeave they keep coming back]]! Microsoft became more and more aggressive in their campaign to get Windows 10 onto 1 billion machines. Rather than merely annoying users relentlessly, they took to tricking and forcing users into installing it with misleading pop-ups, with or without the user's consent. When angry users complained about Microsoft's unethical and devious tactics, the company played it off as unfortunate update errors or blamed the users. [[http://en.rocketnews24.com/2016/05/20/windows-10s-annoying-updates-come-alive-in-twitter-artists-cute-yet-horrifying-manga/ Mercilessly parodied here.]]
** In its relentless drive to get users to upgrade, Microsoft offered Windows 10 for free to users of Windows 7 and 8 as an optional update through Windows Update. Many users have Windows Update set to "download optional updates and ask before installing". Cue ''4 GB'' of data being delivered to users in countries with low data caps or on netbook type devices with minimal onboard storage. Regular Windows updates are several orders of magnitude smaller.
** Windows 10 installs updates by default, leaving home users with no way to defer or refuse defective updates. Before launch, Microsoft claimed they would perform rigorous testing of each new update through their [[PerpetualBeta fast track program]]. Long story short, the first update that left users with an unusable computer arrived ''before launch'', as the beta phase was winding down.
** If you're on a previous version of Windows and notice it installing the "upgrade to Windows 10!" nag screens, you might go "hell no you don't" and delete the applet and registry keys (probably using GWX Control Panel). Do this after the upgrade has downloaded and started its preinstall, but before rebooting and letting it finish, and the system becomes broken beyond rescue, needing a full reinstall.
* If you thought GUI programming in general was bad (see below), well, Microsoft has you covered. Windows has ''five'' separate GUI toolkits ([=Win32=], MFC, Forms, WPF and Metro). [=Win32=] is the one all others are implemented on top of, but its (never used nowadays) default settings produce UI that look like the ones in ''Windows 3.1''. MFC is a C++ library on top of it, but was designed before C++ was fully standardized and comprises numerous ugly hacks such as DIY exception management and classes that ''simulate'' diamond inheritance. Windows Forms is the much cleaner .NET-based equivalent and WPF is supposed to be the revolutionary new interface for the Windows desktop, but Microsoft roughly alternates between apparently abandoning each one. And then comes Metro, better known as "who put a smartphone UI in my desktop?".
** And the real killer with this is that because Forms and WPF are designed around C# and the .NET CLR, there is ''no'' up-to-date native code UI toolkit for Windows. This has resulted in a proliferation of third-party toolkits (all implemented atop [=Win32=]).... which in turn ties down MS to keeping those old toolkits around, because there's so much software using old third-party toolkits that require the old interfaces to be maintained bug-for-bug.
*** Microsoft appears to be trying to fix this with the introduction of the native-code C++/CX and UWP in Windows 10. In fact, the aggressive pushing of the Windows 10 upgrade is likely motivated in part by a desire to maximize the availability of this system to developers.
** It's also kind of bad if you want to program in Ruby. Or Python. Or Factor. Or any language whose low-level libraries are written in C, which is practically all of them. Because of the awkwardness of the Windows native UI libraries, they tend to use cross-platform UNIX-based ones; which often have very badly maintained Windows ports, and again produce software that looks more than 10 years old without extensive retooling. (Java is the one exception.)
** Did we mention that all the work to transform the programmer's use of MFC/Forms/WPF etc into [=Win32=] is done by your computer and your CPU, every time any program you ever run under Windows does anything?
* For a year or so, Windows 10 had multiple bugs and slow-downs, until Microsoft incrementally patched the problems as a token of good will. These annoyances (along with many new mandatory features) did not help Windows 10's negative image in the eyes of owners of previous editions.\\
On startup, Windows 10 detects a problem with the hardware. [[LoadsAndLoadsOfLoading After making a slow-ass scan of the whole system]], it invites the user to click on the repair button -- except the mouse and keyboard aren't responding. With no way to initiate repairs, the OS is utterly useless.\\
The System Restore feature has also been historically problematic. If you ever need to roll back the system, there is no indication that an available restore point is usable until you attempt to use it. Maybe it's anti-virus software misbehaving, but it can be like flipping a coin to see if your restore point works. Chances are, you've tried rolling back the system only for it to reboot and Windows to notify you that your system has '''not''' been changed and the restore point failed. Also ridiculous is when the system reports a failure to restore, but rebooting reveals that the restoration went fine. Additionally, using System Restore after a major service pack update may actually ruin the system, requiring an installation DVD for a clean install (your files are moved into a directory unharmed).\\
You may as well just ignore System Restore and just use ''Control Panel\System and Security\Backup and Restore (Windows 7)'' to save a drive image to a removable HDD/flash-stick and create an emergency boot DVD. Use this combo for when your computer goes insane; there is too much that can go wrong with System Restore.
* As of version 7.18, Skype has a tendency to crash whenever the user tries to copy something out of a chat window, or even when they try to browse through folders! Windows Explorer being a completely separate process which shares no resources whatsoever with Skype, this last one is particularly inexcusable. It's never a good sign when the official solution for a problem is "switch to an older version."
* Although solid-state drives render file fragmentation largely a moot point, on conventional hard disks the anti-fragmentation routines of NTFS are not very forward-thinking. When NTFS places new files, it does put them where they tend to be contiguous, but files that change in size afterward tend to fragment.
** On top of this, Microsoft were so convinced that NTFS was immune to fragmentation that the Windows NT line prior to Windows 2000 ''didn't include a defragmenter''. Never mind that a lot of NT users installed it on drives with FAT or HPFS file systems in order to maintain compatibility with Windows 3.x/9x and OS/2 systems.
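The growth-driven fragmentation described above can be demonstrated with a toy first-fit allocator (a deliberate simplification -- NTFS's real allocation policy is more sophisticated): each file's initial extent is contiguous, but once a neighbour occupies the following clusters, appended data has to land somewhere else on the disk.

```python
def allocate(disk, file_id, n):
    """First-fit allocation: mark the first n free clusters with file_id."""
    placed = []
    for i, cluster in enumerate(disk):
        if cluster is None:
            disk[i] = file_id
            placed.append(i)
            if len(placed) == n:
                break
    return placed

def fragments(disk, file_id):
    """Count the contiguous runs of clusters belonging to file_id."""
    runs, prev = 0, False
    for cluster in disk:
        cur = (cluster == file_id)
        if cur and not prev:
            runs += 1
        prev = cur
    return runs

disk = [None] * 20
allocate(disk, "A", 4)  # A gets clusters 0-3, contiguous
allocate(disk, "B", 4)  # B gets clusters 4-7, right behind A
allocate(disk, "A", 4)  # A grows: forced into clusters 8-11
print(fragments(disk, "A"))  # 2 -- file A is now in two pieces
```

Leaving slack space after each file postpones this, but any allocator that packs files tightly will fragment growing files the same way.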
* Windows 10 has a tendency to install apps without the user's consent, and no way to disable this feature other than [[http://winaero.com/blog/fix-windows-10-installs-apps-like-candy-crush-soda-saga-automatically/ changing the damned registry.]]
* Windows 10's start menu search got a little... overzealous... with its internet integration, to the point where web results are frequently prioritised over things readily found in major directories on your own hard drive, so good luck trying to remember where you left that file -- Cortana sure as hell isn't lifting a finger to help. The Windows Explorer system search fortunately still works for finding mislaid files, but since it has to methodically comb your entire computer, it can and will take longer than you want to wait.
* The revamped version of the Windows Calculator included with Windows 10 will randomly softlock completely out of the blue. It still looks fine, it doesn't say "Not Responding" when you click on it, even Task Manager still says it's running okay. It will simply stop responding to any and all inputs and have to be closed and restarted.
* Windows Aero (since superseded by Metro) may look cool, but it's far from perfect. First off, many programs (especially ones that make use of transparency layers) flat out do not display correctly with Aero turned on. Not to mention it drains performance on weaker machines that barely support it. And if you're a laptop user, it ''will'' hog your battery.

[[folder:Sony - Functionality is Make Believe]]
* As discovered in the months when the [=PS3=] was hacked, the system's software-signing code was found to use a single constant where a random number should be, and it was barely obfuscated at all. Some simple algebra was all it took to extract the private key.[[labelnote:Click here for a full explanation]]The [=PS3=] signs its software with ECDSA, in which each signature must use a fresh random number, or nonce ("'''n'''umber used '''once'''"). The scheme's security depends on that number never repeating: given two signatures that share a nonce, anyone can solve a pair of linear equations to recover the private signing key. So naturally, some Einsteins at Sony wrote their signing code to always use the exact same nonce, for every signature, on every system.[[/labelnote]]
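For the curious, the "simple algebra" goes like this. An ECDSA signature is a pair (r, s) with s = k⁻¹(h + d·r) mod n, where h is the message hash, d the private key, and k the per-signature nonce (r is derived from k). If two signatures share the same k, subtracting the two equations eliminates d and yields k, after which d falls out immediately. A sketch using plain modular arithmetic in place of a real curve (r is treated as a given constant here, since computing it properly requires elliptic-curve point multiplication):

```python
n = 2**31 - 1  # a prime standing in for the curve's group order

def inv(x):
    """Modular inverse via Fermat's little theorem (n is prime)."""
    return pow(x, n - 2, n)

d, k, r = 123456789, 987654321, 555555555  # private key, reused nonce, shared r
h1, h2 = 1111, 2222                        # hashes of two signed messages

# Two signatures made with the SAME nonce k (Sony's mistake):
s1 = (inv(k) * (h1 + d * r)) % n
s2 = (inv(k) * (h2 + d * r)) % n

# The attacker's algebra: s1 - s2 = k^-1 * (h1 - h2), so k follows...
k_recovered = ((h1 - h2) * inv(s1 - s2)) % n
# ...and then s1 * k = h1 + d * r gives up the private key:
d_recovered = ((s1 * k_recovered - h1) * inv(r)) % n

assert (k_recovered, d_recovered) == (k, d)  # total break
```

With d in hand, an attacker can sign arbitrary software as Sony, which is exactly what happened to the [=PS3=].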
* The Sony rootkit designed to install whenever a user placed an audio CD in their computer. Ironically, this wouldn't get installed on many user's computers because ''it required administrative privileges to install'', and a safe setup will deny these privileges to prevent ''just this kind of software from installing''. On top of that, the rootkit installed on [=AutoPlay=], which means (on Windows XP and earlier from before [=AutoPlay=] was changed to be prompt-only) you could defeat it by, on top of disabling [=AutoPlay=] altogether, ''holding the shift key'' when you insert the disc. If the rootkit ''did'' manage to get itself installed on a ''paying customer's'' computer, it would slow down the CPU ''and'' open up gigantic security holes that would invite (additional) malware.\\
Not helping matters was Sony ''taking down posts'' from bands (including Music/{{Switchfoot}}) made to help fans who wanted to bypass the rootkit, and when the then-president of Sony BMG was asked about the whole ordeal, he actually said "[[WhatAnIdiot Most people don't even know what a rootkit is, so why should they care about it?]]". Sony later released a program that would supposedly remove the rootkit, but it only installed ''more'' crap. And downloading it required submitting a valid e-mail address, which Sony was free to sell to spammers. It's perhaps the greatest example of an anti-piracy action that does nothing to discourage piracy and, indeed, punishes people for a legitimate purchase. All this in the name of CopyProtection. Because DigitalPiracyIsEvil, and committing destruction of property on customers' computers is apparently better than possibly letting them copy a CD they've legally bought.
** In a similar vein was BMG's [[https://en.wikipedia.org/wiki/MediaMax_CD-3?scrlybrkr=07945ee1 MediaMax CD-3]] installer, which was implemented and scrutinized around the same time as the Sony scandal (coincidentally, Sony would later acquire BMG). Sony's rootkit at least had the courtesy of asking whether or not you wanted to install it if it loaded; this one went a step further, as not only did it have nearly all the same features as Sony's rootkit, it ''installed whether or not you declined the installation!'' Thus, not only did it operate as a rootkit, it was downright ''malware''. Because [[SarcasmMode that's what you want when purchasing a CD, isn't it?]] Some bands like My Morning Jacket even offered to burn DRM-less copies of their albums affected by the malware for fans.
* Another idiotic Sony DRM idea: [[https://en.wikipedia.org/wiki/Key2Audio Key2Audio]], a DRM system that worked by violating the Red Book Compact Disc standard and putting a dummy track claiming the disc was empty around the outer edge of the disc (which is read first by PC disc drives, while stereos read from the inner track first). The trick to breaking this one? Keep the outer track from being read. How to do that? ''Draw over the edge with a permanent marker.''
* Still Sony: the UsefulNotes/PlayStation3. A firmware bug in which some models believed that 2010 was a leap year resulted in lockouts on ''single player'' games due to the machine refusing to connect to the [=PlayStation=] Network. What was the reason for this system having such a perilous dependency on the judgement of its system clock? [[http://lpar.ath0.com/2010/03/02/8001050f/ DRM!]]
** The bug stemmed from the hardware using binary-coded decimal for the clock. Because apparently converting that time for display is ''so'' difficult for the ''eight core Cell processor''.
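Per the analysis linked above, binary-coded decimal stores one decimal digit per four-bit nibble, so 2010's year byte is 0x10. Read that byte as a plain binary number and you get 16 -- i.e. 2016, which ''is'' a leap year. A sketch of the misread:

```python
def bcd_to_int(b):
    """Decode a binary-coded-decimal byte: each nibble is one decimal digit."""
    return (b >> 4) * 10 + (b & 0x0F)

def is_leap(y):
    return y % 4 == 0 and (y % 100 != 0 or y % 400 == 0)

year_byte = 0x10                        # the clock chip's BCD encoding of "10"
correct = 2000 + bcd_to_int(year_byte)  # decoded properly: 2010
buggy   = 2000 + year_byte              # read as raw binary: 0x10 == 16 -> 2016

print(correct, is_leap(correct))  # 2010 False
print(buggy, is_leap(buggy))      # 2016 True -- hello, phantom February 29th
```

One wrong decode step, and the firmware waits for a February 29th that never comes.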
* And another Sony facepalm to add to the pile: Sony releases frequent updates for their UsefulNotes/PlayStationPortable, mostly in an attempt to fix "security holes" that would allow running "homebrew" applications in the name of preventing piracy. On one occasion a fix for an exploit that would allow such "unauthorized" code to run with the use of a specific game ended up opening an exploit that required no game at all.
** Sony tried to do the same with the [=PlayStation=] 3, in addition to numerous other security features such as the Hypervisor and the Cell Processor's SPE Isolation. As the hacking group [=Fail0verflow=] (the same guys responsible for the major Wii breakthrough) discovered, the only bits of security that are actually implemented well are usermode/kernelmode, per-console keys, and the "on-die boot ROM" -- everything else was either bypassed or broken through. [[https://www.youtube.com/watch?v=hcbaeKA2moE&t=37m0s This includes the public-key cryptography.]] Yes, the cryptography in the [=PS3=] used to check the signature on software was cracked, and Sony's private keys (which are used to sign software for the [=PS3=]) were obtained.
* In a possibly related hack, custom firmware enabled hackers to obtain free games and DLC from the PSN store. Why? Sony made the classic mistake of trusting the client software and assuming a certain variable in the [=PS3=]'s firmware could never be modified.
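The anti-pattern here -- trusting state that the client controls -- is worth spelling out. A deliberately simplified sketch (hypothetical names, nothing to do with Sony's actual PSN code) of the broken check versus the correct one:

```python
# Hypothetical server-side purchase records.
OWNED = {("alice", "game_123")}

def serve_download_trusting(user, item, client_says_owned):
    """Broken: the entitlement check depends on a flag the client sends,
    which a modified console/firmware can simply set to True."""
    return "download granted" if client_says_owned else "denied"

def serve_download_validating(user, item, client_says_owned):
    """Correct: the server consults its own records; whatever the
    client claims about itself is ignored as untrusted input."""
    return "download granted" if (user, item) in OWNED else "denied"

# A modified client lies about owning the game:
print(serve_download_trusting("mallory", "game_123", True))    # download granted
print(serve_download_validating("mallory", "game_123", True))  # denied
```

The general rule: any value computed on hardware the user owns -- a firmware variable, a client-side flag, a locally checked signature -- must be treated as attacker-controlled by the server.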
* The first generation [=PlayStation=] 4 had an issue with the capacitive sensor in its disc eject button. Over time it tends to spuriously activate, leaving the console spitting out discs and then refusing to allow them back in, making it impossible to play any game that comes on a disc. The only official Sony solution? Replace the entire console under warranty, or buy a new one. Never mind that a simple software patch letting the user tell the console to ignore the hardware button would have worked fine -- it's not a mechanical system, and discs can already be ejected through the system menus. Sony replaced it with a mechanical button on all later versions.
* Among the many marketing and management screwups that doomed the Minidisc in classical Sony fashion, the few technical issues of the format tend to get lost in the noise -- but there is one irritation that plagued Minidisc from the start. If your portable recorder was bumped while writing the TOC -- that is, the partition table, as opposed to the actual data on the storage part of the disk -- and managed to burn bad data into it, it would render the disk completely unusable. Players wouldn't recognize it anymore and it was impossible to re-record the bad TOC with a good one, despite the rewritable nature of the media. The kicker: the inability to rewrite a good TOC wasn't a hardware limitation, it's just that ''nobody thought of programming it into the recorders'', whose mechanisms were otherwise perfectly capable of doing it. This is evident because there are a small number of early models -- rare and expensive even then, let alone now -- that do have a TOC-restoring feature -- ''hidden in the debug menu'', of course, because why would any normal user ever need that?
* The Sony Vegas line (which has since changed hands to MAGIX) is a fairly popular video editor, even surpassing the likes of Premiere or Final Cut in terms of functionality. But one flaw it's had for the longest time is how it draws waveforms. For every file you drop into the program, it drops an SFK file in the same directory, and these SFK files vary in size depending on how long the source file is. A simple sound effect, for example, takes on average ''a kilobyte'' just to draw a waveform. So if you're an avid video editor, these SFK files ''will'' clutter your hard drive if you don't keep up with them.

! Multiple Brands, Sorted By Category

[[folder:Alternate Operating Systems: Bugs At No Extra Charge]]
A free or low-cost operating system is great, but bad programming (or none at all) can ruin the fun quickly.
* Linpus Linux Lite, as shipped with the Acer Aspire One. In fairness to Linpus, its GUI could not possibly be more intuitive (plus a boot time of just 20 seconds, and out-of-the-box recognition of xD picture cards and other formats beyond SD, if the computer has a reader that supports them). But there's designing a distro for complete beginners, and then there's designing a distro with several directories ''hard-coded'' to be read-only and an Add/Remove Programs accessible only through some fairly complex command-line tinkering. It doesn't help that the sum total of its official documentation is a ten-page manual containing no information an experienced user couldn't figure out within five minutes of booting, or that updating it was utter hell.
** In Brazil, many low-end computers are sold with horrible Linux distros in order to claim tax breaks for using locally developed software -- distros which cannot be updated without major breakage, are full of security holes, ship ancient versions of packages, and so on, to the point that it seems many people only buy them so they can install a pirated copy of Windows and save money. The same happens in Hungary because of a law prescribing that no computer can be sold without an operating system (the other loophole is paying the computer store separately for parts and assembly, so that on paper you're not buying a computer at all).
*** Same in Italy and lots of countries in South Asia, where identical laws have caused computers -- modern ones, with multicore processors and several gigabytes of RAM -- to be sold with ''[=FreeDOS=]'', although the BIOS or UEFI is more than capable of booting any proper OS installation disc.
* While Linux is sometimes described as the most stable OS, the opposite was true of the 1.1 kernel series, which was notoriously unstable and bug-prone and had poor backwards compatibility. Users quickly reverted to version 1.0 and waited for 1.2. This is the origin of the old convention of reserving odd minor version numbers for unstable development releases.

[[folder:Automotive Annoyances]]
It's times like these that may make you wonder if you're better off driving a trusty classic car.
* Automobiles may have internet connectivity, which means a hacker may amuse himself by [[http://www.wired.com/2015/07/jeep-hack-chrysler-recalls-1-4m-vehicles-bug-fix/ forcing you to lose control of the vehicle]]. [[http://www.wired.com/2015/07/hackers-remotely-kill-jeep-highway/ Maybe that vintage car is worth driving for now.]] Yes, this makes it possible to attempt murder via un-patched vehicles -- no joke. Thankfully, white-hat researchers worked with Chrysler to close this security vulnerability.
* One of the issues with modern cars (especially luxury models) is the sheer number of computer systems that can cause error codes to register on an OBD-II scan tool. The more computerized parts involved in the vehicle's construction, the more likely something will eventually malfunction. [[https://www.youtube.com/watch?v=UsCwse54Tmo Connector corrosion is a possible issue]] and can be enough to prevent your vehicle from even starting due to an electronic engine part malfunctioning. Some of these computer parts can suffer a bad production run and fail unexpectedly, leading to issues such as a traction control system going into "fail-safe" mode due to breakage.
* The 2011 Jeep Grand Cherokee was a hardware equivalent of an ObviousBeta and also a stand-out [[TheAllegedCar alleged car]], due to a device called a "totally integrated power module" (TIPM). The complaints are so frequent that [[http://www.carcomplaints.com/Jeep/Grand_Cherokee/2011/ Car Complaints has issued a warning to avoid this model like the plague.]] Dangerous experiences include the fuel pump shutting off in transit, taking the power steering and brakes with it and causing a crash. Thankfully, Chrysler got a grip on the problem, and by the 2013 model the complaints had been drastically reduced. For comparison, the 2010 model had '''vastly''' fewer complaints.

[[folder:Consoles That'll Leave You Inconsolable]]
Consoles are meant to be the more user-friendly alternative to gaming on home computers. Unfortunately, that's not always the case in practice.
* Some more examples courtesy of Sony (are Microsoft and Sony having an "anything you can screw up, we can screw up worse" competition or something?):
** First, the first batch of [=PS2=]s was known for starting to produce a "Disc Read Error" after some time, eventually refusing to read any disc at all. The cause? The gear for the CD drive's laser tracking had ''absolutely nothing'' to prevent it from slipping, so the laser would gradually drift out of alignment.
** The original model of the PSP had buttons too close to the screen, so the Einsteins at Sony moved the switch for the square button over without moving the location of the button itself. Thus every PSP had an unresponsive square button that would also often stick. Note that the square button is the second-most important face button on the controller, right after X; in other words, it's used constantly during the action in most games. Sony president Ken Kutaragi confirmed that ''this was intentional'', conflating this basic technical flaw with the concept of artistic expression.
--->'''Ken Kutaragi:''' I believe we made the most beautiful thing in the world. [[{{Metaphorgotten}} Nobody would criticize a renowned architect's blueprint that the position of a gate is wrong.]] [[InsaneTrollLogic It's the same as that]].
** And before you ask, yes, that's a real quote sourced by dozens of trusted publications. The man ''actually went there''.
** Another PSP-related issue was that if you held the original model a certain way, the disc would spontaneously eject. It was common enough to be a meme on YTMND, and several ''VideoGame/HalfLife2'' {{game mod}}s also had a PSP that shot exploding discs as a weapon.
** The original UsefulNotes/PlayStation wasn't exempt from issues either. The original Series 1000 units and later Series 3000 units (which converted the 1000's A/V RCA ports to a proprietary A/V port) had the laser reader array at 9 o'clock on the tray. This put it directly adjacent to the power supply, which ran exceptionally hot. Result: the reader lens would warp, causing the system to fail spectacularly and requiring a new unit. Sony admitted this design flaw existed... ''after'' all warranties on the 1000 and 3000 units were up and the Series 5000 with the reader array at 2 o'clock was on the market.
* The infamous "Red Ring of Death" that occurs in some UsefulNotes/{{Xbox 360}} units. Incidentally, that whole debacle was blown way out of proportion, no thanks to the media (it's important to note, however, that Microsoft released official numbers stating that ''51.4%'' of all 360 units were or would eventually be affected by the issue listed below). While there were an abnormally high number of faults, once the media reported the problem, people everywhere started sending back perfectly functional consoles – for the record, only "three segments of a red ring" meant "I'm broken, talk to my makers". Other red-ring codes could be as simple as "Mind pushing my cables in a bit more?", something easy to figure out if you ReadTheFreakingManual.
** However, the design flaw that led to the fatal [=RRoDs=] was at the very least a boneheaded decision on Microsoft's part. Basically, the chip containing the graphics core and memory controller got exceptionally hot under full load, and was only cooled by a crappy little heatsink. This led to the chip in question actually ''desoldering itself'' from the motherboard after a while, and people who opened up the cases on dead units actually reported the chip falling out of the console after removing the heatsink.
*** The heatsink was originally a lot larger, but was shrunk to make room for the DVD drive, and apparently no significant testing was performed after making a risky design choice like that. A more plausible explanation is that the solder joints weren't very reliable under repeated thermal stress, and eventually cracked. The same thing happened to first-generation UsefulNotes/PlayStation3 models (the Yellow Light of Death), albeit much later. Whoever was commissioned to do the assembly of both the 360 and [=PS3=] must've had a grudge.[[note]]A common explanation is that no one knew how to use or handle lead-free solder. It was new at the time, you see. Although, there is some blame to be assigned to the crappy solder...[[/note]]
** The 360 has another design flaw that makes it very easy for the console to scratch your game discs if the system is moved while a disc is still spinning in the tray. Microsoft apparently considered the problem so insignificant (despite being fully aware of it) that when they made the Slim model of the system, they fixed the Red Ring issues (somewhat) but not the disc scratching.
*** Most mechanical drives can tolerate at least some movement. It's not recommended (especially for hard drives, where the head floats just nanometers above the platter), but not accounting for ''any'' movement is just bad. Anyone who has worked in the game-trading industry (such as Gamestop/EB Games) can tell you that not a day goes by without someone trying to get a game fixed or traded in as defective due to the evil Halo Scratch.
*** Microsoft recommends not placing the original UsefulNotes/XboxOne model in any position other than horizontal, because its optical drive isn't designed for any other orientation. Note that ''every'' 360 model was rated to work vertically, even with the aforementioned scratching problem, and Microsoft quickly restored support for vertical orientation with the updated Xbox One S.
** Most of the 360's problems stem from the inexplicable decision to use a full-sized desktop DVD drive, which even in the larger original consoles took up almost a quarter of the internal volume. Early models also had four rather large chips on the motherboard, due to the 90-nm manufacturing process, which also made them run quite hot (especially the [[UsefulNotes/GraphicsProcessingUnit GPU]]-[[UsefulNotes/VideoRAM VRAM]] combo that doubled as a northbridge). But the relative positions of the GPU and the drive (and the latter's bulk) meant that there simply wasn't any room for a practical heatsink! Microsoft tried to address this problem in two separate motherboard redesigns, the first of which finally added at least ''some'' heatsink, but it was only the third -- when the chipset, shrunk to just two components, allowed designers to completely reshuffle the board and even add a little fan atop the new, large heatsink -- that finally more or less did away with the problem. Even the Slim version, however, still uses that hugeass desktop DVD drive, which still has ''no'' support for the disc, perpetuating the scratching problem.
** The circular D-Pad on the 360's controller (likewise for the Microsoft [=SideWinder=] Freestyle Pro gamepad), which is clearly designed to look cool first and actually function second. Anyone who's used it will tell you how hard it is to reliably hit a direction on the pad without hitting the other sensors next to it. The oft-derided [[DisneyOwnsThisTrope U.S. patent system]] might be partially responsible for this, as some of the good ideas (Nintendo's + pad, Sony's cross pad) were "taken". Still, there are plenty of PC pads that don't have this issue to the same degree... or, there were, until every third-party pad [[FollowTheLeader started ripping off the 360 one wholesale]], unusable D-pad and all. Even third-party pads designed specifically for use on the [=PS3=] still look exactly like a 360 pad - yes, still with the unusable D-pad - just with the PS shapes instead of the 360's letters marked on the face buttons.
** Early in the life of the 360, many gamers used the console's optional VGA cable to play their games with HD graphics, as true [=HDTVs=] tended to be rare and expensive back then. PC monitors at the time usually had a 4:3 aspect ratio, which most game engines were smart enough to handle by simply sticking black bars at the top and bottom of the screen, with a few even rendering natively at the right resolution. However, some engines (including the one used for ''VideoGame/NeedForSpeed Most Wanted'' and ''Carbon'') instead rendered the game in 480p -- likely the only 4:3 resolution they supported -- and upscaled the output. Needless to say, playing a 480p game stretched to a higher resolution[[note]](1280x1024 being the most common resolution for mid-range monitors at the time -- which you may notice is actually a 5:4 aspect ratio instead of 4:3, meaning that you had pixel distortion to deal with on top of other things)[[/note]] looked awful, and arguably even worse than just playing it on an SDTV. Some console games with extremely similar simultaneous PC releases, like ''VideoGame/FarCry2'', had to be patched to add a proper widescreen option for the PC versions because players with multiple monitors were dealing with an image so vertically compressed by the "fake" widescreen mode that they couldn't even see the game to play it.
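The footnote's distortion claim checks out with a little arithmetic. A minimal Python sketch, using only the resolutions mentioned above:

```python
# How distorted is a 640x480 (4:3) frame stretched to a 1280x1024 (5:4) panel?
from fractions import Fraction

def stretch_distortion(src_w, src_h, dst_w, dst_h):
    """Vertical-over-horizontal scale ratio of a non-uniform stretch.
    Exactly 1.0 means no distortion; above 1.0 means pixels come out too tall."""
    return (dst_h / src_h) / (dst_w / src_w)

print(Fraction(640, 480))    # 4/3 source aspect
print(Fraction(1280, 1024))  # 5/4 destination aspect
print(round(stretch_distortion(640, 480, 1280, 1024), 3))  # 1.067: ~6.7% too tall
```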
** The original 360's Optical Audio port was built into the analog video connector. If you wanted to utilize both HDMI video and Optical audio, the hardware supported both simultaneously. The ports, however, were placed too close together and the bulky analog connector prevented inserting an HDMI cord. Removing the plastic shroud on the analog connector allows you to use both at the same time.
* Nintendo have made their fair share of blunders over the years as well:
** When Nintendo of America's engineers redesigned the Famicom into the UsefulNotes/NintendoEntertainmentSystem, they removed the pins which allowed for cartridges to include add-on audio chips, and rerouted them to the expansion slot on the bottom of the system in order to facilitate the western counterpart to the in-development Famicom Disk System. Unfortunately, not only was said counterpart never released, there was no real reason they couldn't have run audio expansion pins to both the cartridge slot and expansion port, other than the engineers wanting to save a few cents on the necessary IC. This meant that not only could no western NES game ever have any additional audio chips, it also disincentivised Japanese developers from using them, as it would entail reprogramming the entire soundtrack for a western release.
** The UsefulNotes/VirtualBoy was a poorly-designed console in general, but perhaps the strangest design flaw was the complete absence of a head-strap. While this was ostensibly because of fears that the weight of the device could cause neck strain for younger players, for one thing pre-teens weren't officially supposed to be playing the device anyway, and for another thing the solution they came up with was a fixed 18-inch-tall stand that attached to the underside of the system. This meant that if you didn't have a table that was the ''exact'' right height, you'd likely end up straining your neck and/or back anyway, in addition to the eye strain that the system was notorious for. Even the UsefulNotes/RZone, a notoriously poor ShoddyKnockOffProduct of the system, managed to make room for a head-strap in the design.
** The Wii has no crash handler at all. So if you manage to crash your system, you open it up to arbitrary code execution, and a whole load of security vulnerabilities await you. Do you have an SD card inserted? Well, crash any game that reads and writes to it, and even ''more'' vulnerabilities open up. Nintendo will tell you they've fixed these vulnerabilities through system updates, but in reality, they never did. In fact, all these updates did on that front was remove anything that was installed ''with'' these vulnerabilities -- nothing's stopping you from using the same vulnerabilities again to ''re''-install it all.
** While the UsefulNotes/WiiU ultimately proved a failure for several reasons, poor component choices helped contribute to its near-total lack of third-party support. It'd only be a slight exaggeration to say that the system's CPU was essentially just three Wii [=CPUs=] -- and by extension, three ''UsefulNotes/GameCube'' [=CPUs=] -- heavily overclocked and slapped together on the same die, with performance that was abysmally poor by 2012 standards. Its GPU, while not ''as'' slow, wasn't all that much faster than those of the [=PS3=] and Xbox 360, and used a shader model in-between those of the older consoles and their successors, meaning that ported [=PS3=]/360 games didn't take advantage of the newer hardware, while games designed for the [=PS4=] and Xbox One wouldn't even work to begin with due to the lack of necessary feature support. The system would likely have fared much better if Nintendo had just grabbed an off-the-shelf AMD laptop APU -- which had enough power even in 2012 to brute-force emulate the Wii, eliminating the main reason to keep with the [=PowerPC=] line -- stuffed it into a Wii case and called it a day. Fortunately Nintendo actually seem to have learned from this, basing the UsefulNotes/NintendoSwitch on an existing [=nVidia=] mobile chip which thus far has proven surprisingly capable of punching above its weight.
* After mocking the [[UsefulNotes/GameBoyAdvance GBA]] as childish in its PR, Nokia created the complete design joke that was the original N-Gage. As a phone, the only way to speak or hear anything effectively was to hold the thin side of the unit to your ear (earning it the derisive nickname "taco phone" and the infamous "sidetalking"). From a gaming point of view, it was even worse, as the screen was oriented vertically instead of horizontally like most handhelds, limiting the player's ability to see the game field (very problematic with games like the N-Gage port of ''[[VideoGame/SonicAdvanceTrilogy Sonic Advance]]''). Worst of all, in order to change games, one had to remove the casing and the battery every single time.
** During the development of the N-Gage, Nokia held a conference where they invited representatives from various game developers to test a prototype model of the N-Gage and give feedback. After receiving numerous suggestions on how to improve the N-Gage, Nokia promptly ignored most of them on the grounds that they were making the machines through the same assembly lines as their regular phones and they were not going to alter that.
* As far as bad console (or rather, console add-on) design goes, the all-time biggest example is probably the UsefulNotes/AtariJaguar CD. Aside from the crappy overall production quality of the add-on (the Jaguar itself wasn't too hot in this department, either) and poor aesthetics which [[WebVideo/TheAngryVideoGameNerd many]] [[WebVideo/TheSpoonyExperiment people]] have likened to a toilet seat, the CD sat on top of the Jaguar and often failed to connect properly to the cartridge slot, as opposed to the similar add-ons for the NES and Sega Genesis which used the console's own weight to secure a good connection. Moreover, the disc lid was badly designed and tended to squash the CD against the bottom of the console, which in turn would cause the disc motor to break apart internally from its fruitless attempts to spin the disc. All of this was compounded by Atari's decision to ditch any form of error protection code so as to increase the disc capacity to 800 megabytes, which caused software errors aplenty, and the fact that the parts themselves tended to be defective.\\\
Of note, it was not rare for the device to come fresh from the box in such a state of disrepair that ''highly trained specialists'' couldn't get it working – for example, it could be ''soldered directly to the cartridge port'' and ''still'' display a connection error. This, by the way, is exactly what happened when Creator/JamesRolfe tried to review the system.
** While the UsefulNotes/Atari5200 wasn't that poorly-designed of a system in general -- at worst, its absurdly huge size and power/RF combo switchbox could be annoying to deal with, but Atari eventually did away with the latter, and were working on a smaller revision when the market crashed, forcing them to discontinue the system -- its controllers were a different matter entirely. In many ways they were ahead of their time, with analogue movement, along with start and pause buttons. Unfortunately, Atari cheaped out and didn't bother providing them with an auto-centring mechanism, along with building them out of such cheap materials that they usually tended to fail after a few months, if not ''weeks''. The poor-quality controllers subsequently played a major part in dooming the system...
** ...which brings us right back to the Jaguar. The controller for that system only included three main action buttons, a configuration which was already causing issues for the UsefulNotes/SegaGenesis at the time. In a baffling move, the controller also featured a numeric keypad, something that Atari had last done on the 5200. On that occasion the keypad was pretty superfluous and generally ignored by developers, but it was only taking up what would probably have been unused space on the controller, so it didn't do any harm by being there. The Jaguar's keypad, on the other hand, was far bigger, turning the controller into an ungodly monstrosity that has often been ranked as the absolute worst videogame controller of all-time.[[note]]Its main competition coming ironically from the 5200 controller, though that one's more often given a pass on the grounds that it ''would'' have been decent if Atari hadn't cut so many corners, and that by the early 90s the company really should have known better[[/note]] Atari later saw sense and produced a revised controller that added in three more command buttons and shoulder buttons, but for compatibility reasons they couldn't ditch the keypad, meaning that the newer version was similarly uncomfortable.
* The UsefulNotes/SegaSaturn is, despite its admitted strong points on the player end, seen as one of the worst major consoles internally. It was originally intended to be the best 2D gaming system out there (which it was), so its design was basically just a 32X with higher clock speeds, more memory, and CD storage. However, partway through development, Sega learned of Sony's and Nintendo's upcoming systems (the [=PlayStation=] and Nintendo 64 respectively), which were both designed with 3D games in mind, and realized the market -- especially in their North American stronghold -- was about to shift under their feet; they wouldn't have a prayer of competing. So, in an effort to bring more power to the console, Sega added an extra CPU and GPU to the system. Sounds great! Until you consider that there were also ''six other processors'' that couldn't interface too well. This also made the motherboard prohibitively complex, making it the most expensive console of its time. And lastly, the GPU's basic primitive had four sides (the industry standard was three). This made multiplatform games tricky to port to the Saturn. Ironically, consoles with multiple CPU cores would become commonplace two generations later with the Xbox 360 and [=PlayStation=] 3; like a lot of Sega's other products of that era, they had attempted to push new features before game developers were really ready to make use of them.
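The quad-versus-triangle mismatch has a standard workaround that shows why ports were awkward: a triangle gets submitted to quad-only hardware as a "degenerate" quad with one vertex duplicated. A hedged sketch (illustrative Python, not Sega's actual SDK):

```python
# Sketch of the classic workaround for quad-only hardware (illustration,
# not Sega SDK code): pack a triangle into a four-vertex primitive by
# repeating the final vertex, which collapses one edge to zero length.
def triangle_as_quad(v0, v1, v2):
    """Return the triangle as the 4-vertex quad a quad-based GPU expects."""
    return (v0, v1, v2, v2)  # duplicated vertex = degenerate quad

quad = triangle_as_quad((0, 0), (1, 0), (0, 1))
print(quad)  # ((0, 0), (1, 0), (0, 1), (0, 1))
```

Reportedly, the catch on the Saturn was that its texture mapping assumed four distinct corners, so textures warped around the collapsed edge -- one more reason triangle-based multiplatform art looked off.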
* The Kickstarter-funded UsefulNotes/{{Ouya}} console has been identified as having a huge raft of bad ideas:
** As with many console systems, the Ouya gives the user the option of funding their account by buying funding cards at retail, which provide codes the user can type in to add money to their account. Unfortunately, the Ouya will not proceed beyond boot unless a credit card number is entered, making the funding cards a pointless option.
** When an app offers an in-app purchase, a dialog is displayed asking the user to confirm the purchase -- but no password entry or similar is required, and the OK button is the default. This means that if an app pops up a purchase dialog while you're rapidly pressing buttons, you can accidentally confirm it and be charged.
** The system was touted from the beginning as open and friendly to modification and hacking. This sparked considerable interest, and it became obvious that a sizable part of the supporting community didn't really give two hoots about the Ouya's intended purpose as a gaming console; rather, they just wanted one to hack on and turn into an Android or, preferably, Linux computer. The Ouya people -- who, like every other console manufacturer, counted on making their profit more from selling games than hardware -- promptly reneged on the whole openness thing and locked the Ouya down tight. The end result was a single-purpose gadget with a slow, unintuitive and lag-prone interface that couldn't run most of the already-available Android software despite being an Android system, and didn't even have many games that gamers actually wanted to buy.
** Also, the HDMI output is perpetually [=DRMed=] with HDCP. There's no switch to disable it, not even by turning on developer mode. People who were expecting the openness promised during the campaign are understandably angry at being lied to, as are those hoping to livestream or record a LetsPlay of the games.
** Even in its intended use the Ouya has disappointed its users. The main complaint, which still occasionally appears on /r/ouya when someone gets one second-hand, is that the controllers are laggy. On a console with mostly action-packed casual games, this is ''very'' bad. Apparently it isn't even a fault of the console itself, as the controllers exhibit the same lag when paired to a computer. Opinions differ on whether it was just a large batch of faulty controllers -- apparently not everybody has this issue -- or a design flaw that came out during beta testing but was knowingly ignored and quietly corrected in subsequent batches.
** The fan used to prevent overheating ''isn't pointed at either of the two vents''. Never mind that the console uses a mobile processor, which doesn't even need a fan. In theory, the fan would allow the processor to run at a higher sustained speed. In practice, it blows hot air directly against the wall of the casing, causing frequent issues due to overheating.

[[folder:Digital Rights Mismanagement]]
Hey developers, we paid for your software--you ''will'' show the proper respect.
* ''VideoGame/DragonAgeOrigins'' has a really bad bit with its installer where, when it asks for your CD key, you can use Task Manager (AKA "Ctrl-Alt-Delete") to stop one of the processes that is part of the installer and skip it. Under the DMCA, Task Manager is now illegal, alongside markers and the Shift key.
* Games using [=SecuROM=] V10 or below will also fail to launch (without explanation) if they detect a process named "procexp.exe" (Sysinternals Process Explorer, a program distributed by ''Microsoft''), ostensibly out of fear that hackers will use it to reverse-engineer their DRM -- even though Process Explorer is basically just a beefed-up Task Manager, and Process ''Monitor'' is the one that reveals every action taken by any program. There are two ways you can circumvent this:
** Close [=ProcExp=] and reopen it immediately after starting the game.
** ''Rename the [=ProcExp=] executable to anything else.'' Or be running a 64-bit version of Windows, where it's now called procexp64.exe.
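A name-based process blacklist of this sort is trivial, which is exactly why both workarounds succeed. A hedged sketch of the general technique (illustrative Python; [=SecuROM=]'s actual code is not public, and a real implementation would enumerate processes through an OS API such as Windows' [=CreateToolhelp32Snapshot=] rather than take a list as an argument):

```python
# Sketch of a naive name-based process blacklist (illustration only; not
# SecuROM's actual code, which is proprietary). Matching on the executable
# *file name* is why renaming procexp.exe defeats the check entirely.
BLACKLIST = {"procexp.exe"}  # processes the DRM refuses to run alongside

def blocked(process_names, blacklist=BLACKLIST):
    """Return True if any running process name matches the blacklist.
    Real code would enumerate processes via an OS API; here they're passed in."""
    return any(name.lower() in blacklist for name in process_names)

print(blocked(["explorer.exe", "procexp.exe"]))    # True: launch refused
print(blocked(["explorer.exe", "renamed.exe"]))    # False: rename defeats it
print(blocked(["explorer.exe", "procexp64.exe"]))  # False: 64-bit name slips by
```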
* [=StarForce=], another CopyProtection program, was so poorly written that upon installation it would open huge security holes, crash the computer's OS (Windows Vista in particular would require a reinstall), and could even break the disc drive's hardware. For example, it would disable SCSI and SATA drives on the presumption that they were virtual CD/DVD drives -- never mind that SATA is a common connector for newer disc drives and ''hard drives''.
* In fact, the problems with disc-based DRM like [=SecuROM=], [=StarForce=], and [=SafeDisc=] were severe enough that Windows 10 (and subsequent updates to Windows 7 and 8) outright refuses to run games protected by them... except cracked ones (though "legally cracked" versions are available on GOG when properly licensed).
* Norton was once the ultimate antivirus software, the standard the others aspired to. Then they decided to focus on anti-piracy. The result: the virus scanner hardly works, and even when it does, the discovered virus usually cannot be expunged. For this, Norton now has the ultimate pirate defense: it's such an awful program that anyone knowledgeable enough to pirate software will instead grab the paid, more feature-rich tier of a program that has a free version -- or just go the fully legal route and use those free versions, or the newer Microsoft Security Essentials, all of which cost nothing yet are somehow still far better than Norton.
** With Norton 360, attempting to create a backup will take you to a section where you have to login with your Norton Account. The problem is, if you don't happen to know the account's password (like if your parents gave you the CD as one of the 3 or so allowed copies of the program), the window's exit and right-click-close buttons are grayed out, and ''Task Manager can't kill the application.'' You have to sign in properly or it declares your copy pirated despite providing the CD Key earlier.
** Many antiviral suites such as Norton, Kaspersky, and [=McAfee=] have a certain yearly subscription which users must pay, or else the antivirus will stop protecting their computer. So far, so uninteresting. However, with many of these suites, when the subscription expires, the antivirus will ''continue scanning every webpage the customer visits, and be unable to clear it for viewing.'' This results in extreme slowdowns and general lack of connectivity, which can only be fixed by either uninstalling the program in question -- or paying more for another year's sub. This is a brilliant marketing tactic right up until you realize that many customers have [=McAfee=], Norton, or Kaspersky on their freshly purchased computers for the rough equivalent of ''one month'', at which point their shiny new piece of hardware goes defunct and they become justifiably outraged by what functionally works as a scam.
* Creator/{{Ubisoft}}'s online DRM, which prevented players from playing their games when the servers went down.
** It's starting to get worse with their [=UPlay=] software, which is simply their own copy of Steam, much like EA did with Origin. However, whereas Origin started GrowingTheBeard in 2013, [=UPlay=] is nowhere near polished enough for prime time. And it's their main DRM platform now.
** At one point, their [=UPlay=] software had a remote code execution bug. One hacker built a proof-of-concept exploit that would launch Windows Calculator remotely simply by having the victim visit his website.
** Uninstalling a game that uses [=UPlay=] can cause ''massive'' memory leaks that will grind systems to a halt just with regular use -- you have to use system restore to a point before you installed the game at all.
** In 2013, [=UPlay=] was hacked and a version of ''VideoGame/FarCry3BloodDragon'' was downloaded, full and complete, a few months before its official PC release date. This forced Creator/{{Ubisoft}} to release the game around the same time as the console "exclusives".
** On July 30, 2012, a Google security researcher discovered that [[http://games.slashdot.org/story/12/07/30/1214206/ubisoft-uplay-drm-found-to-include-a-rootkit UPlay includes a rootkit]]. Whether or not this behavior has changed remains to be seen.
** Say what you will about EA and Origin, but at least they have the decency to not sell their games on Steam if they want to force people to use their own program instead. You can still buy a Ubisoft game on Steam, but you still need to install [=UPlay=] to play it.
* The PC version of ''VideoGame/GearsOfWar'' had a DRM certificate that expired on January 28, 2009, making the game unplayable after that date. Luckily, Epic released a patch to fix this shortly after.
* [[Creator/ElectronicArts EA's]] smartphone version of ''VideoGame/{{Tetris}}'' requires an Internet connection -- for a single-player game. If your connection drops mid-game, it kicks you back to the title screen, despite there being no features in the game that should even require Internet connectivity. Fortunately, this is {{subverted}} with the iOS and Android versions.
* The copy protection for ''VideoGame/MeltyBlood Actress Again Current Code'' has a truly insane activation procedure. You have to insert the CD, type in the CD key, and install the game. Then, remove the CD, try to run the game (which will give an error), and click a second button to connect to a website into which you must type a code the game gives you plus a second CD key from the game box. It will then allow you to download a file which you must place in the game's install directory -- which requires going through elevation if you are using Windows Vista or later. Finally, you can then run the game... ack.
** The "type CD key to install part" is especially hilarious, because you don't need to install -- the game is stored as a folder on a DVD itself, unpacked. As many of the Japanese games actually are. You can just copy this folder to any destination you want.
* ''Cross Beats'', despite being a primarily single-player game (with some social elements), has always-on DRM that accesses the server for literally everything, including every menu transition. Perhaps not so coincidentally, less than 2 hours after launch, the whole game had to be temporarily taken down for "emergency maintenance". The official reason given was "user data load distribution processing", which was most likely PR-speak for "Oops, we accidentally [=DDoS'ed=] ourselves."
* Even UsefulNotes/{{Steam}} is not immune to this. If you haven't run it in a while, you may discover that it's trapped in an infinite loop: it attempts to update itself, but aborts partway through because your internet connection faltered for a few milliseconds, and it quits without saving what it's already downloaded. [[Catch22Dilemma You can't run Steam because it needs an update, and you can't update Steam because you can't run it.]] You'd think the solution would be to download the update outside of Steam, but Valve doesn't provide any way to do so.
** The solution for this was to re-run Steam and go through the entire download process again. Of course, then your internet hiccups again, and you spend half of the day repeating this over and over until you get lucky and the update completes without your internet hiccuping even once, just so Steam is updated and you can get into the damn game you paid for.
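For the curious, the fix Steam lacked at the time is ordinary resumable downloading: persist whatever bytes have already arrived, and after a failure retry from that offset instead of from zero. A minimal sketch, with a simulated flaky connection standing in for the real network (none of these names are Steam's actual API):

```python
# Sketch of resumable downloading: keep partial progress across failures
# and retry from the last good offset. The flaky "network" is simulated;
# make_flaky_source / fetch_chunk are illustrative names, not a real API.

def make_flaky_source(data, fail_at):
    """Returns a chunk fetcher that raises exactly once at offset fail_at."""
    state = {"failed": False}
    def fetch_chunk(offset, size):
        if not state["failed"] and offset <= fail_at < offset + size:
            state["failed"] = True
            raise ConnectionError("connection hiccup")
        return data[offset:offset + size]
    return fetch_chunk

def resumable_download(fetch_chunk, total, chunk=4):
    buf = bytearray()              # persisted progress (a file in real life)
    while len(buf) < total:
        try:
            buf += fetch_chunk(len(buf), min(chunk, total - len(buf)))
        except ConnectionError:
            continue               # keep what we have, retry from len(buf)
    return bytes(buf)

payload = b"steam-client-update-v2"
fetch = make_flaky_source(payload, fail_at=10)
assert resumable_download(fetch, len(payload)) == payload
```

Real download tools do this with HTTP range requests; the point is simply that a failure should cost you one chunk, not the whole file.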
* The process of installing ''VideoGame/{{Battlefield 3}}'' can be used as an argument for why EA should stop forcing its Origin platform on their physical releases, and it didn't exactly give the service a good first impression. First, Origin treats installing from a DVD as "downloading" the game, which feels odd given that you can't pause and resume it. Second, you cannot quit Origin while downloading and installing an update, which is something Steam can do. If you want to run the game, you also need to log on to their Battlelog website. And lastly, on a clean install, you have to install a browser plugin so the website can launch the game (which sounds kind of fishy; they eventually added an option, never advertised and hidden under about three different menu screens, to enable plugin-less support). But overall, ''you do not launch Battlefield 3 from Origin, you launch it from the Battlelog website'' -- attempting to run ''Battlefield 3'' the same way you'd run any other game simply opens a new browser tab pointing at Battlelog (unless the service is offline for any reason, at which point it runs the singleplayer like a normal game).
* One quite infamous example is ''VideoGame/RainbowSix Vegas 2''. The game can't be played unless it detects the CD in the PC's disc drive. However, when it came time to release the game digitally, they couldn't reverse the process. What were our [[SarcasmMode genius developers]] to do? They literally torrented a crack of the game and passed it off as an update patch. They made no attempt to hide it, either.
* The Sega System 16, 18, and X boards and the Capcom System 2 all share a component nicknamed the "suicide battery": a battery on the board that keeps alive the volatile memory holding data essential to running the game. If that battery ever dies, ''the game will never work again''. This is why you see a lot of arcade enthusiasts retrofitting their machines with an emulator instead -- that battery is just too much of a hassle to deal with.

[[folder:Games That Play Dumb]]
There are times when even a Core i7 processor will meet its match, and when even an amateur programmer will shake their head in disgust.

''Note: Consider if the entry would fit more as an ObviousBeta or a GameBreakingBug.''
* ''Jan Ryu Mon'', an online TabletopGame/{{Mahjong}} game by [=PlayNC=], which has plenty of evidence suggesting it was programmed by drunken monkeys:
** If a cookie gets blocked from the site (most notably the registration page), the server gives a "Fatal Error" page with a stack trace -- no indication whatsoever that the problem is a blocked cookie.
** The login only works in Internet Explorer -- attempting to log in on any other browser results in a "Please use Internet Explorer 6.0 or higher to log in" pop-up as soon as you click on the username or password fields, and then ''repeats the pop-up every time'' you type a letter, click on one of the fields, or click the "Login" button. There is absolutely nothing on the site that doesn't work in Firefox, except...
** Even though the game ran in a separate .exe file, it would give an error message on startup if the executable was launched directly. The only way to launch it was to download the [=ActiveX=] extension (IE only) from the game's site, log in, then use a button on the site to launch the extension for the sole purpose of launching the game -- the extension would pass along your login information to the game, and they apparently didn't think to make the game executable ask for a login itself.
*** Scary fact: this is becoming an extremely common way to handle log in "securely". Even VideoGame/FinalFantasyXIV is using a variation with an embedded IE session in a "login client" window. Try loading quite a few [=MMOs=] with your Internet connection offline and watch the fun.
** Then when you logged in, the game was ridiculously slow because of the eye-candy animations for ''every single turn''. This was further exacerbated by the graphics engine, which was so inefficiently programmed that the game experienced more lag and frame-skip to show a Mahjong table with ''one hand moving one Mahjong tile'' than ''VideoGame/{{Touhou}}'' games do trying to animate ''2000 bullets simultaneously''. This is often caused by graphics engines which cache nothing or very little, instead opting to re-render most or all of the screen from scratch on every single frame.
** To add insult to injury, their server had major routing issues during the beta (and still have some as of this writing), forcing many players to go through a proxy just to connect, which also meant a LOT of lag -- a round would be easily half an hour, when many other online Mahjong games can finish a round in 10-15 minutes.
** And then there's a [[https://www.youtube.com/watch?v=iJHG7ZBwi3A plethora]] of [[https://www.youtube.com/watch?v=uVi5A45Ij4Y random]] [[https://www.youtube.com/watch?v=_7dFJ6jSkS4 crazy]] {{Game Breaking Bug}}s.
** Plus the game occasionally deals a hand of [[https://www.youtube.com/watch?v=gBIUUsVmCTU#t=8m35s 13 Haku tiles]], even though there's only 4 of each tile in a set.
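The rendering complaint above can be illustrated with a toy model: counting "draw calls", a naive engine that caches nothing redraws every tile every frame, while a caching engine redraws only what actually changed. (Purely illustrative numbers, not the game's actual rendering code -- though 136 happens to be the size of a real Mahjong tile set.)

```python
# Toy illustration of cache-nothing rendering: the naive renderer pays
# for all N sprites on every frame, the caching renderer only pays for
# the sprites marked dirty. Names and numbers are purely illustrative.

def naive_frame(sprites):
    return len(sprites)            # draw calls: redraw everything, always

def cached_frame(sprites, dirty):
    return len(dirty)              # draw calls: redraw only what moved

table = ["tile%d" % i for i in range(136)]   # a full Mahjong tile set
moved = {"tile42"}                           # one tile slides this turn

assert naive_frame(table) == 136
assert cached_frame(table, moved) == 1
```

The gap widens with scene complexity, which is how a single moving tile can lag harder than two thousand cached bullet sprites.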
* ''VideoGame/BigRigsOverTheRoadRacing''. It's an [[ObviousBeta obvious pre-alpha build]] of a game, with no collision detection, no AI, and some incredibly bad programming of physics (as in, go as fast as you want – Warp 9000, if you have the patience – but ''only in reverse''). Despite being obviously an alpha, the creators ''tried to sell it as it was anyway''.
** In one of the strongest cases of "why bother?" in history, they added AI in a patch... which makes the opposing truck drive a fixed course at a rate of 1 mph and stop just short of the finish line. The opponent stops because it has finished the race -- but there is no loss condition in the game. That's right: the game simply doesn't allow you to lose.
** Something in the game's code causes it to be a severe memory vampire -- it uses ''half'' the available RAM when it's running. No one knows why. Lord knows nothing in the game itself supports its needing anywhere near that amount.
* ''VideoGame/SonicTheHedgehog2006'' is an ObviousBeta -- probably the most infamous example from a AAA-publisher -- rightly criticized for all manner of things. But probably the biggest complaint is its LoadsAndLoadsOfLoading, particularly in the hub worlds. This is because the game loads ''the entire hub world'' whether it's needed or not... even for a single line of dialogue.
** In several cases, talking to an NPC would cause the game to reload the entire hub world from the disc -- even though you were ''already in the hub world''. Once it finished, the NPC would speak a ''single line'' of dialogue, and then the game would ''load the entire hub world again''. This is subverted in one of the missions in Shadow's story, where you are given the instructions first and the game only loads once. You'd think they would have made every mission work like this to bypass the ludicrously frequent load times.
** This issue is demonstrated well during one of the boss rounds -- one bug lets you explore the whole hub world, even though only a small portion of it is used in the fight. Notably, the textures for certain locations won't load if you do, but the [[SkewedPriorities coding]] ''will.''
* ''VideoGame/{{Action 52}}'' for the NES is already a masterpiece of broken game design, but some of its glitches are caused by the ''mapper in the cartridge'', not just the coding. ''Action 52'' uses Mapper 228, a unique mapper similar to the ones found in most bootleg multi-carts. NES games use a variety of mappers, and some offer extra effects of their own -- the Japanese version of ''Contra'', for example, has snow blowing in the mountain level, an effect removed from the international versions to cut costs. When using a mapper, the programmer needs to be aware that the console can only read information from it in a specific way, or else problems arise. Even then, it's not guaranteed to work, because for the most part the NES was never designed to read unlicensed mappers, especially ones programmed in ways the console can't handle. One notable result is that ''Alfredo and the Fettucini'' and ''Jigsaw'' always crash when you try to load them normally, even though most games on the cart start just fine. [[labelnote:Why is that?]]If you open the game ROM with a hex editor like [=XVI32=], you will discover that both games share the same ''bootup screen'' -- the same screen that lets you choose between one or two players also ''executes the game''. Mapper 228 is supposed to handle that redirection, but most emulators, and even the NES itself, don't support it properly.[[/labelnote]] Of course, ''Action 52'' is already infamous for boasting so many games in one cartridge when not a single one of them got past ObviousBeta quality, or is even acceptable by alpha standards.
** The mapper issue is also why the unreleased ''Cheetahmen II'' stops at Level 4, as it uses the same cartridge hardware as ''Action 52''. While this is fixed in unofficial patches and the still incredibly buggy re-release, if by some miracle you own an original cartridge, you need to fiddle with it in the slot so that it boots into the otherwise inaccessible levels.
* The ''VideoGame/TetrisTheGrandMaster'' clone ''Heboris'' -- specifically, the [[GameMod unofficial expansion]] -- has more or less died out, because attempts to peek into the source code, much less make any further modifications, have proven futile: the game is a messy hybrid of a game scripting language and C++.
** On a related note, there were a handful of genuine C++ ports of it. However, the MINI version (which allows for "plugins" to define additional game modes and/or rotation rules) is the most commonly used version, and the way it works pretty much inhibits any attempt at porting it entirely to C++.
* ''Zero Gear'' isn't problematic in and of itself... but the nature of its UsefulNotes/{{Steam}} integration allows it to be used to play any Steam-locked game you want, without owning the game. This is most notably used by hackers to bypass VAC bans: just start a new account, download the ''Zero Gear'' demo, and copy the files over.
* ''Franchise/{{Pokemon}}'' may have reached the levels of technical genius in the Game Boy days, but they can still dip into the other side of the spectrum:
** The save system in ''[[VideoGame/PokemonDiamondAndPearl Pokémon Diamond, Pearl, and Platinum]]''. Making any changes in the PC storage system -- let alone ''simply opening it up'' -- increases the save time from 5 seconds to '''15''' seconds, because an overly cautious integrity system runs a hefty checksum review over '''every''' slot in the storage system to make sure it isn't corrupted. Are you the type that likes to save regularly? Sucks to be you! The sad part is that it saves the game the same way ''[[VideoGame/PokemonRubyAndSapphire Pokémon Ruby, Sapphire and Emerald]]'' did, and those games consistently took about five seconds to save.
** Not that the Game Boy Advance games did everything right, either. ''Pokémon Emerald'' has a flaw in its RNG system: it always starts at the value 0 due to a coding error. This makes each "frame" from the start of the game exactly the same across boots. The end result: the player is left with a predictable sequence of personality values, and therefore knows exactly what stats and nature they will get for a given amount of time waited in-game. For example, if the player encounters a Pokémon exactly 100 seconds after starting their cartridge, they will always receive this spread: Relaxed 22/12/10/10/29. Similarly, if a ''VideoGame/PokemonRubyAndSapphire'' cartridge has a dead or removed battery, the game will always start its RNG at 0x5a0.
** As innovative as ''VideoGame/PokemonXAndY'' were, they had their fair share of problems. The profanity filter is notoriously arbitrary, [[ScunthorpeProblem blocking innocent nicknames because they contain profanity in other languages]] (e.g. "Violet" because "viol" means "rape" in French). This is particularly bad because some of the censored words become legal under certain circumstances (e.g. having any other character attached to either end of the word) while others don't have that distinction. The companion app ''Pokémon Bank'' is also notorious for letting through hacked Pokémon (which it's supposed to block) while rejecting legitimate ones. And of course, at launch, Game Freak forgot to encrypt the games' wi-fi communications, enabling unscrupulous players to use a program to identify what moves their opponent is using each turn in a match; the version 1.2 patch (mandatory for online communication) fortunately fixed this oversight.
** Nintendo's attempts to stamp out piracy for ''VideoGame/PokemonSunAndMoon'' were especially questionable. Since they couldn't differentiate legitimate copies from pirated ones, and since a complete build of the game had leaked over a week in advance, they decided to ban any account that used in-game online services before the official release date. This meant that players with legitimate copies that just happened to arrive early -- through broken street dates or early shipping -- were treated the same as pirates. To add insult to injury, the hackers quickly found ways to bypass the bans.
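The Emerald seeding bug a few bullets up is easy to reproduce in miniature. The sketch below uses the well-known GBA Pokémon linear congruential generator constants and assumes the usual 60 frames per second; it is a simplified model, not the game's actual code:

```python
# Why Emerald's RNG is predictable: the game advances a 32-bit linear
# congruential generator once per frame, but always seeds it with 0 at
# boot, so frame N yields the same value on every single power-on.
# Multiplier/increment are the well-known GBA Pokemon LCG constants.

def lcg_next(seed):
    return (seed * 0x41C64E6D + 0x6073) & 0xFFFFFFFF

def value_at_frame(frames, seed=0):
    for _ in range(frames):
        seed = lcg_next(seed)
    return seed >> 16          # the game consumes the high 16 bits

# Two separate "boots" reaching the same frame (100 s * 60 fps = 6000)
# produce identical RNG output, hence identical wild Pokemon spreads.
assert value_at_frame(6000) == value_at_frame(6000)
assert value_at_frame(1) == 0x6073 >> 16   # first step from seed 0
```

With a battery-backed clock the real games mix the time of day into the seed, which is exactly what the dead-battery Ruby/Sapphire carts mentioned above can no longer do.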
* Sometimes, ''VideoGame/DungeonFighterOnline'' shows a broken page in the launcher, saying that navigation to the page was canceled. Since the entire program is based on Internet Explorer, even the Firefox plugin made by users at DFO Nexus won't access the page, usually giving a timeout error. The worst part is that a bug regarding SSL certificates could not only break the login page, but also show login pages that are not from Neople at all, such as godaddy.com or datameer.com. Fortunately, this stuff seems to be fixable by resetting the router.
* Indie game developer and [[SmallNameBigEgo utter egomaniac]] [=MDickie=] has released the [[http://www.mdickie.com/code.htm source code]] for many of his games, including the infamous ''VideoGame/TheYouTestament''. By looking at the code, you discover for instance that some "exit program" functions work by ''deliberately crashing the game''. And that's just the tip of the iceberg.
* The open-source Windows port of ''[[VideoGame/{{Syndicate}} Syndicate Wars]]'' eats an insane amount of disk space for no logical reason.
* ''Death Trap'' is already an [[SoBadItsGood/VideoGames infa]][[Horrible/VideoGamesOther mous]] point-and-click horror game, but one of its more notorious aspects is how the game works under the hood. [=ActionScript 2=][[note]]the scripting language in Flash, since succeeded by [=AS3=][[/note]] has two commands called "gotoAndPlay(frame number)"[[labelnote:explanation]]jumps to the chosen frame number and plays the movie from there[[/labelnote]] and "gotoAndStop(frame number)"[[labelnote:explanation]]same as above, but stops the movie on that frame[[/labelnote]], typically placed inside a button's "on(release)" handler -- and only one of the two can be used per handler, or it won't work. There's nothing wrong with these commands as such; the reason this falls under Idiot Programming is that they're misused. They're meant for really basic things, like menu buttons, not for more complex things like, say... movement buttons in a point-and-click game; you can guess the author used them because of how simple they are. Because these two commands work on the main timeline, it's impossible to build a traditional point-and-click game this way and expect it to work without making it extremely linear. And that's exactly what the author did -- he made the game extremely linear.
* Back in the days of DOS and Windows 9x, many games (such as the PC version of ''VideoGame/SlaveZero'', as Lowtax discovered in one of the first articles he ever wrote for Website/SomethingAwful) were hard-coded to assume that the CD-ROM drive was D:, rather than actually bothering to check with the OS to see that this was the case. If you had more than one hard drive, or had your disk partitioned -- which a lot of people with >2GB hard drives had to do prior to Windows 98 arriving on the scene -- you were generally out of luck unless you could either edit the game's configuration files or find a No-CD patch.
** Ubisoft apparently decided to resurrect this little relic of the [=90s=] with ''VideoGame/AssassinsCreedUnity'' and ''VideoGame/AssassinsCreedSyndicate'', which will crash on boot-up if they're installed anywhere other than the C: drive. It ''is'' possible to get them to work while installed on another drive, but requires copying several files and creating symbolic links to the C: drive.
* ''VideoGame/{{Magicka}}'' is a very fun game, but sometimes it's so difficult to work with that it's ''almost'' not worth the trouble.
** The game takes almost two minutes to start up. That's ''before'' the company logo shows up, too. You just sit there staring at a black screen for two minutes. (Thankfully, you can skip straight to the title screen once the company logos ''do'' start appearing.)
** The graphics are nice, but that's because of its superb art direction. Technology-wise, it's a fairly standard 3D top-down brawler... And yet it chugs down resources like a cutting-edge first-person shooter. A laptop that runs ''Team Fortress 2'' and ''Left 4 Dead'' can't keep up with ''Magicka''.
** If someone's connection drops during a multiplayer game, there is no way for them to re-join. The remaining player(s) must either go on alone, or quit and go back to the lobby.
** The developers acknowledged the game's disastrous launch with a series of blog posts and weeks of patching that made the experience more bearable. Then, with their characteristic and warped sense of humor, they introduced a free DLC called "Mea Culpa" that gave each player character a set of gag items: A "Patched" robe, a "Buggy" Staff, a "Broken" Sword and a "Crash To Desktop" magick.
* The developers of the ''VideoGame/AnnoDomini'' series need to be slapped with some basic GUI guideline books. The first game, for example, would only save after you hit the save button -- not immediately after naming the save game -- and would completely remove the game directory on uninstall, including save games and settings. Anno 1503 (the second game) never got its promised multiplayer mode. Anno 1404, released in ''2009'', still assumed that there would be only one user, and that this user would have admin rights.
* The installer for ''VideoGame/DukeNukemForever'' seems to have been programmed by someone with an "everything but the kitchen sink" mentality. Not only does it install a bunch of applications and frameworks that the game doesn't actually use, but it installs the AMD Dual-Core Optimiser, regardless of whether your CPU is multi-core, or even made by AMD.
* ''VideoGame/EveOnline'' had a stellar example of what not to do in programming with the initial rollout of the ''Trinity'' update. Eve had a file called boot.ini that contains various parameters... but it's also a critical system file stored in C:\. A typo in the patch caused it to overwrite the version in the root directory, not the EO folder, resulting in an unbootable system that had to be recovered with a rescue disk. This is why you never name your files after something that already exists in the OS. (Since that debacle, the file in question has been renamed start.ini.)
* {{Creator/Valve}} took it ''UpToEleven'' by [[https://github.com/ValveSoftware/steam-for-linux/issues/3671 making its Linux Steam client delete all files on the machine it's running on if its directory is changed]].
** To make matters worse, Steam's uninstaller also [[https://support.steampowered.com/kb_article.php?ref=9609-OBMP-2526 deletes everything in the folder it's in]].
** When adding a game library folder that already has games in it, Steam may delete the entire thing and start over -- even if all your games are in it.
* ''[[Videogame/{{Myth}} Myth II: Soulblighter]]'' had an uninstaller bug that was discovered after the game had gone gold. If you uninstalled the game, it deleted the directory in which the game was installed. It was possible to override the default install settings and install the game in the root directory of a given drive. Fortunately, only one person suffered a drive wipe as a result (the person who discovered the bug), and they actually replaced the discs after the copies of the game were boxed, but before the game was shipped. Still, it was a fairly glaring blunder.
* ''VideoGame/{{Diablo}} II'' has fairly simple mechanics due to its nature as an online title released in 2000. That did not prevent Blizzard from introducing bugs into literally every skill tree, affecting about 20% of all skills in the game. The curse resist aura got ''weaker'' as you put points into it. A rapid-fire melee attack missed completely if the first swing missed. Masteries claimed to increase damage on spells and items but didn't actually do it. Energy Shield bypassed resistances, meaning your mana could be drained in two fireballs. Homing projectiles went invisible if fired from off-screen. Lightning bolt spells ignored faster-cast-rate items for no particular reason. Berserk correctly set your defense to 0 when you used it, but using it again gave you negative defense, and after a while the value rolled around and gave you 8 million defense. Numbers were wildly off on release: the high-level Lightning Strike spear skill did a total of 50 damage at maximum spell level, and the poison from reanimated skeletal mages did '''1''' damage per second over the course of five minutes. And that's just skills -- there were also numerous dupe bugs, ways to teleport to locked maps, and the list goes on.
** Infamously, the game was only considered difficult for three reasons. First, about half the combinations of random enchantments a boss could have interacted in bugged ways that resulted in an instant kill (fun example: the fire enchanted/lightning enchanted combination erroneously added the huge damage of the fire-enchanted death explosion to every single one of the 10+ lightning sparks emitted every time the boss was struck). Second, the poison clouds of claw vipers invisibly delivered their melee attack 25 times per second, resulting in a RRRRRRR sound and a very quick death. Third, gloams drained a slight amount of mana on attack, but also seemed to deal 256 times that amount as damage whenever they hit you with anything.
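The 8-million-defense rollaround described above is consistent with ordinary signed-integer wraparound: store defense in a fixed-width signed integer, keep subtracting, and the value eventually wraps from very negative to hugely positive. A hypothetical model (not Blizzard's actual code):

```python
# Hypothetical model of the Berserk defense bug: defense lives in a
# 32-bit two's-complement integer, so repeated subtraction can push it
# negative and eventually wrap around to an enormous positive number.

def to_i32(n):
    """Interpret n as a 32-bit two's-complement signed integer."""
    n &= 0xFFFFFFFF
    return n - 0x100000000 if n >= 0x80000000 else n

defense = 1000
defense = to_i32(defense - 5000)          # first use: goes negative
assert defense == -4000
defense = to_i32(defense - 0x7FFFFFFF)    # keep stacking penalties...
assert defense > 2_000_000_000            # ...and it wraps to "8 million+"
```

Python's unbounded integers hide this failure mode, which is why the sketch has to mask to 32 bits explicitly; in C-family code the wrap happens silently.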
* ''VideoGame/DiabloIII'' had a particularly funny example of this. When the game launched, the auction house gave you a 5-minute window to cancel your auctions; beyond that, they were stuck in the auction house for two days (a later patch allowed canceling). Fans found a workaround: set your computer's clock to the day before you put the auction up, and the cancel timer would appear again. Why it was programmed to use your computer's time instead of the Battle.net servers' time, we will never know.
* ''VideoGame/SpaceEmpires V'', unlike all its predecessors, is notorious for being a resource hog, bringing even the mightiest of [=PC=]s to their knees, despite running only on [=DirectX=] 8:
** The battle replay files generated by the game, instead of storing "Ship A shot at ship B and hit it for 15 damage to the armor.", instead store "Ship A fired bullet X. 50 milliseconds later, bullet X has moved a few pixels. After another 50 milliseconds, bullet X has moved a few more pixels. Repeat ad infinitum. Bullet X hit ship B for 15 damage to the armor." Thus, battle replays are frequently in the tens of megabytes -- per turn!
** Every time you load the ship list screen, the game loads into memory all the data about all the ships in the game, regardless of whether it will actually be used in the list's current view mode, delaying the screen's loading by upwards of a second. Fortunately, you can disable the loading of some of the larger bits of data, such as order queues -- but then you can't see them in the list when you bring up the views that use them...
** The game's developer apparently has no idea about the existence of multicore [=CPUs=], and made the entire game single-threaded; thus, running the game on a 2GHz quad-core will be noticeably ''slower'' than running it on a 3GHz single-core.
** Processing a turn for a multiplayer game when the game is set to fullscreen mode is much slower than processing the same turn when the game is set to windowed mode, in which the game doesn't bother drawing anything, but just processes in the background. Surely rendering a progress bar can't be all ''that'' taxing? (In fact, ''everything'' is slower in fullscreen mode for some reason... isn't that kind of backwards?)
** The game has a scripting language which you can use to write random event and espionage scripts... but the language is horribly broken in how it handles order of operations. Calculating 5 + 3 * 4 will give you 32, not 17 -- but that's just the tip of the iceberg. Somehow you can take 1, then alternately subtract and add 1, and wind up with any negative number you want, simply by repeating the subtraction-addition cycle the right number of times!
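The 5 + 3 * 4 = 32 result is exactly what a naive interpreter produces when it folds the expression strictly left to right instead of honoring operator precedence. A minimal sketch of such an evaluator (an illustration of the failure mode, not Malfador's actual interpreter):

```python
# A left-to-right expression "evaluator" with no operator precedence:
# it folds the token list in order, so 5 + 3 * 4 becomes (5 + 3) * 4.
import operator

OPS = {"+": operator.add, "-": operator.sub, "*": operator.mul}

def eval_left_to_right(tokens):
    """tokens like [5, "+", 3, "*", 4] -- a plain fold, no precedence."""
    acc = tokens[0]
    for i in range(1, len(tokens), 2):
        acc = OPS[tokens[i]](acc, tokens[i + 1])
    return acc

assert eval_left_to_right([5, "+", 3, "*", 4]) == 32   # the buggy result
assert 5 + 3 * 4 == 17                                 # the correct one
```

A correct evaluator needs either a precedence-climbing parser or two passes (multiplication first, then addition); skipping that step yields exactly the behavior described.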
* The graphics engine for ''Need For Speed: Most Wanted'' (2012) literally cannot run above 30 FPS without massive framerate drops. The drops have no correlation to the amount of action being rendered on-screen -- there are framerate drops in the loading screen animation, where no polygons are rendered whatsoever. This was discovered by players who noticed that locking the framerate to 30 in the PC version's configuration file eliminates the gratuitous slowdown the PC version is known for. Here's the thing, though: the game's engine is a modified version of the one that powered the previous Criterion NFS game, ''Hot Pursuit'' (2010), which ran perfectly fine at an uncapped framerate. How does a company in charge of a big-budget racing series modify an existing, completely proprietary graphics engine and accidentally break its high-framerate capabilities?
** In ''Need for Speed: Rivals'', meanwhile, the (locked at 30 FPS) framerate is directly bound to the real speed of the game. Setting it to 60 FPS through third-party software makes the game run at ''double speed'', resulting in screwy behavior of all sorts including broken physics.
* ''VisualNovel/WelcomeToPonyville'' was a relatively short ''WesternAnimation/MyLittlePonyFriendshipIsMagic'' visual novel, and yet it was over 1 GB in size. This was apparently due to the fact that the programmers appeared to have no knowledge of audio compression: for example, a brief file for a closing door was over 32 MB (with several minutes of silence included in the file to boot). Some have estimated that if the audio files were properly compressed along with the sprites the game could have been 60-90 MB.
** To make matters worse, after one reviewer criticized the game's technical flaws [[SmallNameBigEgo they proceeded to block his social media accounts so that he could no longer contact them, and refused to take any advice from other programmers]]. Some even suspect that their shutdown was not due to hackers like they claimed, but embarrassment after an unofficial 5MB Flash port was released on the internet.
** Even worse, a lot of the files in the zip file ''were [[DummiedOut unused]]''. Literally every character folder has at least one unused file that's never called in the game.
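The "several minutes of silence" claim above is easy to sanity-check with arithmetic: uncompressed CD-quality PCM audio has a fixed byte cost per second, so a 32 MB file must contain minutes of audio, however brief the actual door sound is:

```python
# Back-of-the-envelope check on the 32 MB door sound: uncompressed PCM
# costs rate * channels * bytes_per_sample bytes per second, so a file
# that size must hold minutes of (mostly silent) CD-quality audio.

def pcm_bytes(seconds, rate=44100, channels=2, bytes_per_sample=2):
    return seconds * rate * channels * bytes_per_sample

bytes_per_second = pcm_bytes(1)              # 176,400 bytes/s at CD quality
seconds_in_32mb = 32 * 1024 * 1024 / bytes_per_second

assert bytes_per_second == 176400
assert 3 < seconds_in_32mb / 60 < 3.5        # roughly 3.2 minutes of audio
```

A lossy codec at a typical 128 kbps would shrink those 32 MB to around 3 MB even before trimming the silence, which is how estimates like "60-90 MB for the whole game" come about.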
* The original ''Videogame/PlanetSide'', a 2003 MMOFPS with up to 111 players on each of the three empires fighting on each individual 20x20km continent, handled ''everything'' clientside, with the game only checking up with what the server knows every couple seconds. A significant amount of the gameplay was devoted to exploiting the hilariously bad netcode ("ADADA" strafing relied on the game not realizing when players shift their strafing direction, [[TeleportSpam causing them to teleport from side to side on other player's screens]]). Because everything was clientside, the game was also one of the easiest MMO games to hack. Common hacks included warping all players on a continent to stand directly in front of you (on your screen) and kill them, causing players from across the continent to spontaneously fall over dead, or enabling a hack to disable weapon cone-of-fire, resulting in sniper shotguns and Gatling guns that could pick off players from a thousand meters out. Even without hacks, the game was pitifully easy to exploit. Players would stand somewhere safe, unplug their Ethernet cable, then run around the battlefield knifing enemy players to death. Once the player is happy with the amount of kills, the player would plug the cable back in and bask in huge amounts of XP as half the battlefield drops over dead -- though this was eventually fixed by locking the game's controls after a few seconds with no server communication. For a long time, having a certain type of dual-core processor would cause the game to run in turbo mode, causing vehicle and players to move and fire several times faster than normal. Thankfully, ''Planetside 2'' uses much more serverside authentication, which has fixed almost all the issues with the first game's idiotic netcode.
** The original ''Planetside'' is still referred to as "Clientside" in some fan circles to this day.
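The serverside authentication that ''Planetside 2'' adopted boils down to the server re-checking every claim the client makes instead of trusting it. A minimal sketch of the idea (hypothetical names and numbers, not actual ''Planetside'' code):

```python
def server_validates_hit(shooter_pos, target_pos, weapon_range):
    """Server-authoritative check: the server, not the client, decides
    whether a claimed hit was physically possible."""
    dx = shooter_pos[0] - target_pos[0]
    dy = shooter_pos[1] - target_pos[1]
    distance = (dx * dx + dy * dy) ** 0.5
    return distance <= weapon_range

# A hacked client claiming a knife kill from across the continent is rejected:
print(server_validates_hit((0, 0), (5000, 0), weapon_range=2))  # False
# A kill claimed at knife range passes the same check:
print(server_validates_hit((0, 0), (1.5, 0), weapon_range=2))   # True
```

A clientside game like the original simply skips this check and takes the client's word for it, which is why the cable-pulling trick worked at all.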
* The Android port of the classic SNES RPG ''VideoGame/ChronoTrigger'' has a rating of 3 stars on Google Play. How, you may ask, does such a beloved game get such a low rating? The reasons are many. First off is the file size. ''Chrono Trigger'' was completely rewritten for smartphones to include touch-based commands and movement. The result is a ''37 MB'' app (the original SNES game was just a few megabytes in size). Another gripe is the game's very broken DRM system -- to play the game, the user is ''required'' to have an Internet connection so that the app can authenticate itself and download the various areas of the game. That's right -- this 37 MB app contains ''no'' game content whatsoever, apart from the title screen. What's more, the game automatically quits (more like crashes) if the Internet connection is lost -- without saving the game. Many a ''Chrono Trigger'' player has lost an hour or more of game time when their device switched from 3G or 4G to a home Wi-Fi network (phones momentarily cut off Internet when they change from 3G/4G to Wi-Fi). Even more egregious is the fact that the game is ''incompatible'' with many popular devices, even the Samsung Galaxy S3, a phone more than capable of running it. In short, this port of ''Chrono Trigger'' is perhaps the worst console-to-mobile port ever.
** The nail in the coffin? The UsefulNotes/NintendoDS port of the game had already existed for years, set a precedent on how to use touch controls (albeit with a second screen), and even with several hours of additional content left a lot of empty space on a DS game card. There is absolutely no excuse.
* The initial rollout of ''VideoGame/TheWitcher'' had atrociously bad back-end code. This is best seen when upgrading to the Enhanced Edition. Despite the latter ''adding'' no small amount of content, the same computers ended up loading as much as '''80%''' faster than before. That's the difference a little optimization can make.
** Playing Poker Dice in the first title is heavily prone to crashing [=PC=]s. Not just the game itself, but the ENTIRE operating system, to the point where it forces a reboot.
* ''VideoGame/TheSims2'' can corrupt an entire neighborhood through something as simple as ''deleting a Sim or family'', a feature that's more than easy to access. Even modders can't fix the problem. Worse still, some of the premade families that ship with expansion packs are inherently corrupted (though this can be rectified by replacing the expansion packs' Sim Bin templates with empty templates made by modders, preventing those inherently corrupted families from spawning in newly made or updated neighborhoods).
* ''VideoGame/TheSims3'' is known for having a large number of expansion packs...and for having a horribly wasteful method of installing them. Every time you install an expansion pack, another copy of an identical set of help files (in all languages) is installed on the machine. Plus, if you uninstall Sims 3 itself, it removes the majority of the actual content of the expansions -- but not these extra help files, for which you must run the expansion pack's own uninstaller.
** The game's engine itself is reportedly so broken that it makes Bethesda games at launch look polished: without fanmade tweaks the game is riddled with game-breaking bugs, and since there's no internal FPS limiter, an external one is necessary on high-end hardware to keep the system from overheating.
* ''VideoGame/SplinterCellBlacklist'' is plagued with issues, such as poor multiplayer support because people can't seem to connect to one another (and everything goes through Ubisoft's [=UPlay=] service) and where the game's saves are stored. You would think, if you regularly back up your saves, that they would live in the My Documents folder or the App Data folder. Nope, they live where [=UPlay=] is installed, usually in the Program Files folder. And to make matters even worse, the permissions for the saves folder are left open to anyone -- which wouldn't be a problem, except that writing anything else into Program Files requires admin privileges...
* The notoriously bad SuperMarioWorld GameMod ''Super Mario Superstar 1'' has a weird example of this in the final battle. Not only does the final boss somehow completely fail to do absolutely anything (to the point no one can figure out what it was supposed to do) and make it UnwinnableByMistake, but somehow, via some absolute miracle of shoddy programming... the boss' graphics seem to glitch if you enter its room through one door and not the other. As in, you go through one level ID and it appears correctly (with bad Microsoft Paint style graphics), you go from another level ID and the graphics are glitched tiles. No idea how the programmer pulled that kind of fail off.
* Speaking of resolutions, the PC version of ''VideoGame/DeadlyPremonition'' (hilariously named the "Director's Cut") is such a bad port that the game has only ONE possible resolution: 1280x720. While this is High Definition, most PC monitors can render higher resolutions -- even the lowest-quality LCD monitor has a native resolution bigger than that, and any CRT monitor has a different aspect ratio. And the game was released in 2013, hardly the dark ages of monitor technology. Thank goodness, this being PC and all, [[http://blog.metaclassofnil.com/?tag=dpfix modders were there to do what the developers couldn't]].
** Though version 0.9.5 of said mod causes a lighting glitch that the previous version didn't.
* ''VideoGame/CallOfDutyGhosts'' is apparently a sloppily developed game. It weighs in at a hefty 30GB and requires 4GB of RAM (8GB recommended). You'd think this game would be amazing on all fronts... until you find that the game might be a little bloaty, will happily eat your RAM (it even stutters on the loading movies after the level is done loading), and looks no better than previous installments. Compare this to the TechDemoGame TropeCodifier ''Crysis'' which runs just fine on the recommended requirements (a "wimpy" dual core processor and 2GB of RAM), it just needed a graphics card from its future (and even then, only 2 generations ahead) to run on absolute max settings at 60FPS.
* The online play in ''VideoGame/{{Meteos}} Wars'' did not seem to take lag into account whatsoever: As a FallingBlocks PuzzleGame, it only needs to keep track of the other player's incoming blocks and controller inputs. Instead, it seems to send data between players about the exact locations of every block as they're falling, resulting in much more lag than is necessary. If the connection becomes unstable, instead of taking guesses and correcting itself later, as is seen with all other falling-blocks puzzle games with online play, it just stops the action outright until the signal restabilizes, or terminates the match outright if it's taken more than a few seconds. All the while, as the online play shifts between slow and immobile, the timer counts down in real time, functioning totally independently of the game, resulting in nearly every online match ending in a time-out instead of elimination. Geographic distance seems to profoundly affect the lag too: The closer the opponent is to you, the less lag there is, and it plays almost normally if your opponent is within about 300 kilometers. Conversely, the game is almost unplayable if up against someone from another continent.
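The gap between the two approaches is easy to quantify. A toy comparison (illustrative structures, not the game's actual wire format): replicating every block's exact position each tick costs an order of magnitude more bandwidth than sending only the inputs and letting both sides run the same deterministic simulation.

```python
import json

# Naive approach: replicate the exact position of every falling block, every tick.
board_state = {"blocks": [{"x": x, "y": y} for x in range(10) for y in range(14)]}

# Leaner approach: send only controller inputs and incoming garbage; each
# client reconstructs the board by simulating the same rules locally.
inputs = {"frame": 1024, "keys": ["left", "rotate"], "incoming_blocks": 3}

print(len(json.dumps(board_state)))  # thousands of bytes per tick
print(len(json.dumps(inputs)))       # a few dozen bytes per tick
```

This input-replication design is the standard one for deterministic puzzle and fighting games, which is why the entry above singles ''Meteos Wars'' out for not using it.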
* When it was announced that the PC version of ''VideoGame/{{Titanfall}}'' would require a massive 48GB of hard drive space, many assumed that it was a sign of developers finally starting to include visual assets that far exceed those of typical [=360/PS3=] ports. As it actually turned out, a whopping 35GB of the game's installation was taken up by ''uncompressed audio files'', in every single one of the game's twenty languages. The developers' rationale for this was because they thought the audio decompression would take up too much CPU power on older and/or bargain-basement [=PC=]s... never mind the fact that any CPU which struggles with anything as simple as audio decoding probably wouldn't be powerful enough to run the game in the first place. Had they compressed the audio files and actually made separate installers for different languages, the game install would probably only be about 15GB.
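The arithmetic behind that 35GB is straightforward. A back-of-the-envelope sketch (illustrative CD/DVD-quality figures, not ''Titanfall'''s actual audio format):

```python
def pcm_size_bytes(seconds, sample_rate=48_000, channels=2, bytes_per_sample=2):
    """Size of raw (uncompressed) PCM audio: it grows linearly with
    duration, sample rate, channel count, and sample depth."""
    return seconds * sample_rate * channels * bytes_per_sample

one_hour = pcm_size_bytes(3600)    # ~0.69 GB for a single hour of stereo audio
twenty_languages = one_hour * 20   # shipping every language multiplies it all

# A lossy codec at ~128 kbps needs roughly 58 MB for that same hour instead.
compressed_hour = 3600 * 128_000 // 8
print(one_hour, twenty_languages, compressed_hour)
```

Even granting the developers' CPU-load excuse, the multiplier from bundling all twenty languages uncompressed is what pushed the install to 48GB.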
* Fargus Multimedia's Russian bootleg of ''VideoGame/PuttPutt Saves the Zoo'' is a complete collection of failure. On top of having enough problems as it was (characters commonly speak in the wrong voices and their lips keep moving after finishing a line of dialog), it contains quite possibly one of the biggest fails possible -- the game was originally ''completely unplayable''. Why? They packaged the game with ''a blank [=W32=] file'', the file that ''executes the game.'' It wouldn't be until 2004 that fans would fix this by using torrents to grab an American [=W32=] and stick it into the Russian version.
* The DOS versions of ''VideoGame/{{Mega Man|Classic}} 1'' and ''3'' were programmed to base the speed they ran at on that of the computer's processor, with no upper limit. This isn't an inherently poor choice, as most DOS games did the same, not predicting the speed at which technology would improve. What sets it apart from those games is that while they run too fast to play on modern computers, ''Mega Man 1'' and ''3'' ran too fast to play on computers ''on the market at the time of their release.''
** The PC port of ''VideoGame/Sonic3AndKnuckles'' had the same issue. When it ran in windowed mode, the game's framerate was based on the speed of the processor running it, which even when the port was first released resulted in a ridiculously fast, unplayable game. Fortunately for its case, playing the game in fullscreen instead bases the framerate on the refresh rate of the monitor, which even two decades later still usually lands around the framerate the game was meant to run at.
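The standard fix, then as now, is to tie game speed to real elapsed time rather than to however fast the processor can push frames. A minimal delta-time sketch (hypothetical speed and frame counts):

```python
def advance(position, speed, dt):
    """Move at a fixed rate in real time, independent of frame rate."""
    return position + speed * dt

def simulate(frames, total_seconds=1.0):
    """Render `frames` frames covering one second of wall-clock time,
    scaling each movement step by the time the frame actually took."""
    pos, dt = 0.0, total_seconds / frames
    for _ in range(frames):
        pos = advance(pos, speed=100.0, dt=dt)
    return pos

# A slow PC (60 FPS) and a machine ten times faster cover the same distance:
print(simulate(60), simulate(600))
```

Games that instead stepped a fixed amount per frame, like the DOS ''Mega Man'' ports, ran exactly as fast as the CPU could loop.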
* ''VisualNovel/DontTakeItPersonallyBabeItJustAintYourStory'' closes by forcing itself to crash.
** The version of ''VideoGame/PlanescapeTorment'' on the ''Dungeons and Dragons Anthology'' collection does the same thing, which is odd, given the otherwise fine carryover job.
* ''Super Monkey Daibouken''. Despite being for the Famicom, as opposed to the Famicom Disk System, it had to load regularly. And that's to say nothing of the game's battle system, which is too inconsistent to be remotely coherent.
* ''VideoGame/HoshiWoMiruHito'' had a world map so badly programmed that the display just spat out tiles at random. The display never showed your total hit points, and the battle menu was so broken that you couldn't even access it without progressing in the story.
* ''VideoGame/LEGOIsland 2: The Brickster's Revenge'' is infamous for its LoadsAndLoadsOfLoading, on par with the likes of ''VideoGame/SonicTheHedgehog2006''. The explanation behind this? The game prioritizes ''rendering the loading screen over actually reading any data.'' This means that for every frame that is rendered, a small amount (as in, ''a single byte!'') of information is read, which causes the game to spend an eternity loading things for absolutely no good reason. The kicker? This can be fixed by changing '''a single instruction''' in the game's EXE file, which instead changes the logic to read all the data before rendering a single frame, and the loading is minimized to only half a second rather than minutes on end.
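The difference between the two read strategies can be sketched in a few lines (a hypothetical illustration of the pattern, not the game's actual code):

```python
def load_per_frame(data, bytes_per_frame=1):
    """The broken pattern: render a full loading-screen frame, then read
    a single byte. Returns how many frames the load takes."""
    loaded, frames = bytearray(), 0
    while len(loaded) < len(data):
        # render_loading_screen()  <- an entire frame is spent here
        loaded += data[len(loaded):len(loaded) + bytes_per_frame]
        frames += 1
    return frames

def load_then_render(data):
    """The one-instruction fix: read everything first, render one frame."""
    loaded = bytes(data)  # read all the data up front
    return 1

asset = b"x" * 10_000
# At 60 FPS, one byte per frame turns a 10 KB asset into ~167 seconds of loading.
print(load_per_frame(asset), load_then_render(asset))
```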
* The netcode for the original ''VideoGame/DarkSouls'' is infamous for being terrible. It can take upwards of a half-hour to summon a friendly player, and sometimes the game will think that you're being invaded by an unfriendly player, blocking you from exiting an area for upwards of ten minutes until the problem resolves itself. This isn't even getting into the [[PortingDisaster PC port]], which uses the much-maligned Games For Windows Live.
** ''VideoGame/DarkSoulsII'' attempted to fix this problem by adding dedicated servers for matchmaking, which did alleviate the issues somewhat. Unfortunately, the dedicated servers are '''only''' used for matchmaking; once you actually connect everything is handled as a peer-to-peer connection, meaning astronomical lagspikes and severe HitboxDissonance if you happen to connect to another player across the world.
** And it seems like From Software didn't quite learn their lesson the first time; ''VideoGame/{{Bloodborne}}'''s netcode is almost as bad as the original ''VideoGame/DarkSouls'' and has all the same issues. It also had terrible environmental optimization before patches, leading to LoadsAndLoadsOfLoading.
** ''VideoGame/DarkSouls3'' had an anti-cheat software so poorly programmed at launch that it would often mistake ''the coding in the game itself as cheats'', ejecting customers who had done nothing at all from online play. Sometimes even simply ''switching from offline to online mode'' could trigger the anti-cheat. Fortunately, Creator/FromSoftware did eventually go through flagged accounts and removed many of the resulting bans.
* Ah, ''VideoGame/{{Minecraft}}'': a beautiful world that rewards creativity, a multiplayer experience like no other, endless exploration to wherever your heart desires, and the uncanny ability to bring modern quadcore computers to their knees. How is this possible? How does a game with graphics so simple they should run fine on a Pentium 2 require a modern gaming box to play at 1080p without stuttering -- let alone if you dare to apply graphic-improving mods? Well, let's just say there's a reason nobody writes 3D games in Java. There were even third-party attempts to recreate the game in C, but obvious legal issues caused them to be dropped, to the groans of competent programmers the world over.
* The netcode in ''VideoGame/JojosBizarreAdventureAllStarBattle'' is ''terrible'', resulting in mind-blowing amounts of lag no matter how good your connection is. This thankfully doesn't affect Campaign mode (where you fight other players' "avatars", and through which much of the content is unlocked) because that only uses the connection to look up what other players have set as their characters (the actual fight is you versus the AI).
* ''VideoGame/DragonAgeInquisition'' was loaded with bugs from the very beginning. The most notorious among them, however, is the infamous Patch 4. It's bad enough that the whole thing was nearly seven gigabytes, but there's also a programming error in the Xbox One port which causes it to ''uninstall and reinstall the entire game''.
* ''VideoGame/BanzaiPecan: The Last Hope for the Young Century'' is not without its problems, but none were as egregious as the palette-swap feature for its main character. Without using them, the game usually takes roughly 4 seconds to load, but using any of the palette-swaps (save for her [[spoiler:[[SuperMode Awakened Mode]]]]) extends the load times from 4 to nearly '''16''' seconds. What makes this even more ironic is that the game can load the most basic enemy type in many other colors without any issues.
* ''VideoGame/GrandTheftAutoIV:'' The i386 Microsoft Windows port of the game is commonly cited as an example of this trope before patches were released: dog-slow performance on the most commonly available computers of the time, and a graphical settings menu where setting the quality to low actually made the game run ''slower''. True, patches helped alleviate these problems, and the game engine boasts its share of [[GameMod game mods]], but the game became a textbook example of [[PortingDisaster shoddy PC ports.]] Also, the game required the infamous ''Games For Windows'', a program that is discussed in the Microsoft section of this page.
* When ''Videogame/{{Disgaea}}'' came out on PC, it arrived in a rocky state. [[http://steamcommunity.com/app/405900/discussions/4/412448792361194752/ One modder]] decided to reverse engineer the game and see what was causing the problems, discovering that much of the game relied on bizarre and archaic methods of handling basic internal things like input polling, controller support, texture coordinates and screen refreshing in [=OpenGL=], resulting in poor performance, unresponsive controls, and spotty gamepad support.
* The remake of ''VideoGame/CannonFodder'' recreates the game with a modern 3D engine -- which takes an incredibly long time to load, even on modern fast computers, and doesn't keep loading in the background if you alt-tab out of the process. Click start, go brew a cup of coffee, lazily mix it with sugar and milk, drink it while you read the paper, then come back and if you're lucky the game will be just about ready to start. And for all that, the graphics are really nothing to write home about.
* Sega, with their Mega Drive/Genesis titles on UsefulNotes/{{Steam}}, has finally opened up [[GameMod modding support]] for them through the Steam Workshop, much to the rejoicing of fans everywhere. However, it was soon discovered that for ''some reason'', uploaded mods that utilised SRAM support wouldn't save their data... unless the mod was listed under ''VideoGame/Sonic3AndKnuckles''. And even ''then'', it would not save all data properly if the mod's SRAM data was bigger than the aforementioned game's, as seen with ''[[http://steamcommunity.com/sharedfiles/filedetails/?id=674578957 Sonic 3 Complete]]''.
* ''Wildlife Camp'' is a PC game released in 2010 that doesn't have an option to turn fullscreen mode off. This would be bad enough on its own, but if you open up the game's options.ini file, you'll find that the game ''does'' have a fullscreen toggle -- the line of code that enables the button for it to appear has just been commented out. Re-enabling the code causes the fullscreen toggle to appear in-game and work without problems, raising the question of why it was DummiedOut in the first place, especially since being able to put a game in windowed mode is one of the most basic features a game can have.
* ''Sqij!'' for the ZX Spectrum has been described by Creator/StuartAshen as the worst game ever made, even worse than the usual contenders like ''VideoGame/BigRigsOverTheRoadRacing''. The game's code forces Caps Lock to be on, but the game only accepts lower-case inputs, meaning you can't do anything without manually exiting the game to switch Caps Lock off. Even if you do manage to get it working, you learn that the game is a constant stream of collateral damage, even if you go as far as to ''patch the source code to fix the bugs''. To give a couple of examples:
** The game is riddled with severe graphical glitches, such as afterimages and bits torn off the sprites. The game does not properly track the locations of enemies, so you are deemed to have hit them if you go anywhere near them -- but also to have shot them if you fire anywhere near them.
** The code to collect objects doesn't work at all; even if you fix that, collecting more than one piece of the "Enertree" (the goal of the game) makes the game believe you have collected zero pieces, and it won't let you drop any off. In a nutshell, you can't beat the game.
** If you enter a room with a locked door from the right, the game teleports you through the door, because it's programmed to always move you to the left when you enter such a room. Due to incorrectly reused variable names, two of the game's doors open and close constantly every time an enemy moves up or down. To add insult to injury, if you attempt to leave any room without being in the lower half of the passageway, the game crashes.
** The game is also notoriously slow. The reason for this? If you look at [[https://drive.google.com/file/d/0B1lIWldt8I1edUo4ZHBkRXdNVjA/view ''Sqij'''s source code]], the first instruction reads 1 GO TO 2 ([[DepartmentOfRedundancyDepartment the first line is solely there to point the computer at line 2]]). You will then notice that line 2 doesn't exist, causing the game to constantly search for a line that was never implemented.
** As a side effect of the horrendous coding, loading the game also loads the full binaries for Laser Basic, the BASIC extension used to write the game. By quitting the game you can then use this yourself, meaning the game contains a full, illegally pirated version of a £14.99 software utility. Despite this, its publisher released it '''twice''' -- once on its own, once as part of a compilation without taking the program out. Due to this, distributing and owning a physical copy of the game is ''technically illegal''.
** Unlike other examples, there's a reason why ''Sqij!'' is such a spectacular fail in the coding department. The problem might be more accurately called SpringtimeForHitler Programming: its programmer (who was 15 at the time) was contractually obliged to create a game for the publisher The Power House, but fell out with the bosses and had no intention of spending any effort on it. To fulfill his obligation, the poor coder spent a few hours making a game so bad it was impossible to release, assuming the publisher would just reject it. He was horrified when the publisher decided to just ''release it without reviewing it.'' The game became [[BileFascination notorious]] in the ZX Spectrum community, and comp.sys.sinclair's 'bad game contest' was hosted at sqij.co.uk (now defunct) in tribute to it. The programmer in question later admitted [[OldShame the game was bad and took the whole thing in stride.]]
* The ''Franchise/FinalFantasy'' series is known for its polished production values, but its entries were often coded in bizarre, idiotic ways.
** In ''VideoGame/FinalFantasy'', the intelligence stat does nothing, running from battles references the wrong data, and most equipment buffs and spell debuffs simply don't work. The only flaw that seems to be an AscendedGlitch, since it went unfixed in subsequent re-releases, is that the critical hit rate is tied to the weapon's index in the database.
** ''VideoGame/FinalFantasyVI'' is a hot mess of GameBreakingBug content with stats that do nothing, items that don't work, status effects that hit when they shouldn't and easily accessible skills that turn the entire game universe (and your save) into a hell of glitched-out blocks.
** ''VideoGame/FinalFantasyVII'' has multiple scenes that never play as they should, because they were erroneously linked to arbitrary flags. This includes how the ability to see Aeris's ghost after her death is switched off by talking to an unrelated innkeeper in a flashback, and how a scene with Cloud and Barret discussing politics is locked away forever by a flag that triggers when going to the back of the church with Aeris. There are also scenes that can only be accessed by having far more Love Points than exist in the game, locking them out.
** ''VideoGame/FinalFantasyXII'''s ScrappyMechanic, the treasure chests that generate loot based on whether or not you have a certain piece of armour equipped, is not intentional. It was the result of an RNG that happened to be dependent on which other chests had been opened by the player, and the position of the Diamond Armlet item. Strategy guide writers soon figured out how the RNG functioned, leading many to assume this was a deliberate game mechanic.
* ''VideoGame/LittleKingsStory'' was made in a custom engine, built without maintainability or future usage in mind, with myriad parts of the game relying on the 30 FPS framerate in bizarre ways. This came back to bite XSEED in the ass when they wanted to bring it to PC in 2016, resulting in a massive PortingDisaster. It was so bad that they had to bring in Durante, a master in reverse engineering, to try and fix it. While Durante managed to make the game more stable and optimized, he had to admit defeat when it came to making the game run at an arbitrary framerate, and commended the people responsible for the original porting for managing to make it run at all. [[http://xseedgames.tumblr.com/post/156725636690/little-kings-story-pc-relaunch-guest-blog Durante made a post about his time fixing the game here.]]
* Not that Creator/ValveSoftware was known for paying much attention to ''VideoGame/CounterStrike'' in the first place, but a few aspects of the game are just plain terribly coded:
** Valve Anti-Cheat. Oh, where do we start? Cheating is so endemic to CS thanks to this system's failures that players on third-party clients with stricter cheat detection make up a massive part of the playerbase. Notably, it only scans the game files to see if a program is modifying them, with no regard to what's actually happening in game; as it's fairly easy to hide a cheat from VAC, that means a player running SuperSpeed across the map getting [[ImprobableAimingSkills five instant headshots every single round]] can only be banned through manual reports, and a player who hides his cheats is nearly impossible to get banned.
** For some reason, player models don't reflect the way the actual player is aiming on his screen; instead of mirroring the player's aim exactly, the model appears to enemies to snap onto them incredibly quickly in an extremely fishy way, which contributes to a lot of "crying wolf" reports that make the CSGO banning system even more of a joke. Why Valve doesn't simply fix this is beyond the community's knowledge.
** The "mesh" layer that codes for bot movement in some maps is incredibly poorly generated, including bots taking longer to go around obstacles that no longer exist, standing out in the open because the area is coded as "cover", and, in one case, causing them to fail jumps and inflict significant damage on themselves.
* When ''VideoGame/NightsOfAzure'' was [[PortingDisaster ported to PC]], the developers didn't include the option to ''close the game'', requiring users to force-quit it from outside the program. This is one of the most basic features any game should have, yet they didn't bother to code it in. Also, if you're a keyboard user, ''Azure'' and ''VideoGame/AtelierSophieTheAlchemistOfTheMysteriousBook'' have no options to rebind keys. If that's not bad enough, the games don't even bother to tell you which key does what!
* The licensed NES games produced by Australian developer Beam Software in the late 1980s were notorious for their botched renditions of well-known songs. This wasn't due to any lack of talent on the part of the composers, however[[note]](in fact, their main composer in this period, Tania Smith, was a frequent collaborator with none other than Music/KylieMinogue)[[/note]]; for some inexplicable reason the company's NES sound engine was only able to play music back at a tempo of [=150bpm=] and nothing else. This wasn't such an issue with the original titles the company produced, as the composers could write their music with this limitation in mind, but most of their output was licensed games for Creator/LJNToys, meaning that a ''lot'' of familiar tunes got butchered. It wasn't until near the end of the NES's lifespan that they fixed this problem.
* Creator/TelltaleGames is notorious for pushing out {{Obvious Beta}}s of their games. It's to the point where you have to question whether or not they actually bug test these things.
** ''VideoGame/StrongBadsCoolGameForAttractivePeople'' has its share of problems. Skipping the dialogue too often in the PC version will cause the character textures to disappear. There's also an infamous bug in the Wii version of ''Homestar Ruiner'' where the game freezes on Coach Z's dialogue in Extended Mode if you've set your console to widescreen mode (which you most likely have). To make matters worse, [[UnwinnableByMistake you need to see this scene to get 100% completion]].
** For whatever reason, ''VideoGame/PokerNightAtTheInventory'' refuses to load if you have an Xbox 360 controller or something similar plugged in to a USB port.
** Never play the [=iOS=] version of ''VideoGame/PokerNight2''. Ever. It's a PortingDisaster because the game was never designed for it. There's slowdowns, random crashes, and anti-aliasing beyond belief. That, and it's nearly impossible to play unless you're using an [=iPad=].
** The Wii versions of ''VideoGame/SamAndMaxFreelancePolice'' tend to place the cursor in the lower right corner of the screen, thus locking the controls.
* It's very clear that the developers of ''VideoGame/SnakePass'' had no idea what they were doing when attempting to make the game run on other platforms. All they did was pile on the anti-aliasing until they got it to run at least halfway decently. Did we mention that the game also runs at 30 FPS?
* For a while, ''{{VideoGame/Undertale}}'' went without its Fun Value mechanic only because the developer didn't know that the string reading the value was case-sensitive. This didn't hinder the game too much, as it was still beatable, but a big part of what enhanced the game's replay value wasn't being incorporated at that point.
* ''VideoGame/CartoonNetworkPunchTimeExplosion'' is a halfway decent ''VideoGame/SuperSmashBros'' clone. Emphasis on "halfway". There's a ton of lazy development quirks that almost make the game unplayable. Unresponsive controls, an unfinished battle system, and an unintuitive control scheme that even veteran ''Smash'' players find hard to get used to. There's also a quite infamous bug that [[UnwinnableByMistake freezes the game during one of the final levels]]. And instead of trying to fix the issue, the developers basically told everyone to just cheat by inputting the code that unlocks everything, which makes playing Story Mode pointless. Granted, there are ways to bypass this bug (many people did it just by sheer luck) but it's still a problem that makes the game a textbook example of ObviousBeta.
* The Steam version of ''Videogame/PAYDAY2'' is pretty good: stable, and consistently and frequently updated... except that every time it updates, the game's total size temporarily doubles from 30-40 GB to well over 60-70 GB for backup purposes, without any explanation beforehand... although once the update finishes, the game shrinks back to its original size.
* The ''WesternAnimation/BluesClues'' games by Creator/HumongousEntertainment are the only games that can't run on Windows 3.1 despite other SCUMM engine games working just fine on it. Why is this? The FullMotionVideo for Steve has filenames that are too long [[note]]Versions of Windows before 95 could only read and write filenames up to eight characters in length [[/note]]. If the files and pointers are shortened manually, [[https://youtu.be/57Tzf374Q_k the games run as expected]].
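The limitation in question is the old FAT "8.3" naming scheme. A quick sketch of the check (the sample filenames here are made up for illustration, not the actual Humongous asset names):

```python
def is_dos_83(filename):
    """True if a filename fits the 8.3 limit imposed by pre-Windows-95
    systems: at most 8 name characters, one dot, 3 extension characters."""
    name, dot, ext = filename.partition(".")
    return len(name) <= 8 and len(ext) <= 3 and "." not in ext

print(is_dos_83("INTRO.AVI"))         # True: visible to Windows 3.1
print(is_dos_83("STEVEINTRO01.AVI"))  # False: invisible, so the game breaks
```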
* Initial releases of ''VideoGame/TheHorde'' were deliberately programmed to ''delete all the save files from every other game on the system to make room for itself, without even prompting the player.'' This was changed when it was realized that people don't like this "feature."

[[folder:Hardware That Wears Hard on You]]
You'd think these problems would be ironed out before launch, especially if made by a MegaCorp.
* The infamous A20 line. Due to a quirk in how its addressing system worked[[note]](basically, it skipped the bounds check there)[[/note]], Intel's 8088/86 [=CPUs=] could theoretically address slightly more than their advertised 1 MB. But because they ''physically'' still had only 20 address pins, the resulting address just wrapped around, so the last 64K of memory were actually the same as the first. Some early programmers[[note]]among them, Microsoft; the CALL 5 entry point in MS-DOS relies on this behavior[[/note]] were, unsurprisingly, stupid enough to use this almost-not-a-bug [[IdiotBall as a feature]]. So, when the 24-bit 80286 rolled in, a problem arose -- nothing wrapped any more. In a truly stellar example of "compatibility is God" thinking, IBM engineers couldn't think up anything better than to simply ''block'' the offending [=21st=] pin (the aforementioned A20 line) on the motherboard side, making the 286 unable to use a solid chunk of its memory above 1 meg until this switch was turned on. This might have been an acceptable (if very clumsy) solution had IBM defaulted to having the A20 line enabled and provided an option to disable it when needed, but instead they decided to have it always turned off unless the OS specifically enables it. By the 386 era, no sane programmer used that wraparound trick any more, but turning the A20 line on is ''still'' among the very first things any PC OS has to do. It wasn't until Intel introduced the Core i7 in 2008 that they finally decided "screw it" and locked the A20 line into being permanently enabled.
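The wraparound itself falls straight out of real-mode address arithmetic. A small model of it (a simplification -- on real hardware the A20 gate lived in the keyboard controller, among other places):

```python
def real_mode_address(segment, offset, a20_enabled):
    """segment:offset -> linear address, 8086-style. With only 20 address
    lines, anything at or past 1 MB wraps back to zero; the A20 line
    gates the 21st address bit."""
    linear = (segment << 4) + offset          # real-mode address formula
    mask = 0x1FFFFF if a20_enabled else 0xFFFFF  # 21 vs 20 usable bits
    return linear & mask

# FFFF:0010 is exactly 1 MB. On an 8086 (or with A20 blocked) it wraps to 0...
print(hex(real_mode_address(0xFFFF, 0x0010, a20_enabled=False)))  # 0x0
# ...but with A20 enabled it reaches the High Memory Area, as on a 286+.
print(hex(real_mode_address(0xFFFF, 0x0010, a20_enabled=True)))   # 0x100000
```

That 0x100000-and-up range unlocked by A20 is the 64K "high memory area" that DOS later used for HIMEM.SYS tricks.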
* [=AMD=] hasn't had all good ideas, either:
** Their wildly successful Athlon Thunderbird ran at high speeds and for a while obliterated everything else on the market, but it was also the hottest CPU ever made up until that point. This wouldn't be so bad in and of itself – even hotter [=CPUs=] were made by both [=AMD=] and Intel in later years – but the Thunderbird was special in that it had no heat-management features whatsoever. If you ran one without the heatsink – or, more plausibly, if the heavy chunk of aluminium sitting on the processor broke the mounting clips through its sheer weight and dropped to the floor of the case – the processor would insta-barbecue itself.
** In late 2006 it was obvious that Intel were determined to pay AMD back for the years of ass-kickings it had endured at the hands of the Athlon 64, by releasing the Core 2 Quad only five months after the Core 2 Duo had turned the performance tables. The Phenom was still some ways off, so AMD responded with the Quad FX, a consumer-oriented dual-processor platform that could mount two dual-core chips (branded as Athlon [=64s=], but actually rebadged Opteron server chips). While re-purposing Opterons for desktop use was something that had worked magnificently three years prior, this time it became obvious that AMD [[DidntThinkThisThrough hadn't thought things through]]. Not only was this set-up more expensive than a Core 2 Quad (the [=CPUs=] and motherboard worked out to about the same price, but you needed twice the memory modules, a more powerful PSU and a copy of Windows XP Professional), but it generally wasn't any faster, and in anything that didn't use all four cores actually tended to be far ''slower'', as Windows XP had no idea how to deal with the two memory pools created by the dual-CPU set-up (Vista was a lot more adept in that department, but had its own set of problems).
** Amazingly enough, things got worse when the Phenom eventually did arrive on the scene. In addition to being clocked far too slow to compete with the Core 2 Quad -- which wasn't really due to any particular design flaw, other than its native quad-core design being a little AwesomeButImpractical -- it turned out that there was a major problem with the chip's translation lookaside buffer (TLB), which could lead to crashes and/or data corruption in certain rare circumstances. Instead of either initiating a full recall or banking on the fact that 98% of users would never encounter this bug, AMD chose a somewhat odd third option and issued a BIOS patch that disabled the TLB altogether, crippling the chip's performance. They soon released Phenoms that didn't have the problem at all, but any slim hope of it succeeding went up in smoke after this fiasco.
*** Interestingly enough, AMD fared better with the Phenom II, which improved performance dramatically and pulled close to Intel once again; their six-core chips were even good enough to attract some buyers (and to beat the first-generation Core i7 in some workloads).
** AMD's Bulldozer may make people wonder why they went that route. On the surface, having two integer cores share a floating-point unit makes some sense, as most operations are integer-based. Except those cores also shared an instruction decoder and scheduler, effectively making a single core with two disjointed pools of execution units. Also, each integer core was weaker than the Phenom II's. To make matters worse, AMD adopted a deep pipeline and high clock frequencies; anyone paying attention to processor history would recognize those two decisions as the root cause of the Pentium 4's failure. Still, it was forgivable, since they offered more cores (up to 8 threads in 4 modules) and higher clock speeds than Intel.
*** However, things went downhill with the AMD Carrizo. It cut the L2 cache, which had been the one thing keeping performance from being outmatched by Intel's Haswell, and the chip was stuck on a 28 nm process, making matters worse. Even worse? Laptop manufacturers (laptops being where Carrizo was intended to go) chose the worst possible designs for it, dragging its performance down to near ''Intel Nehalem'' levels – a chip that was outdated by ''six years''.
* Qualcomm had their own share of failures: the Snapdragon 808 and 810 were very powerful chips for the time (2015), being based on the high-performance ARM A57 design, but they had one crucial disadvantage: they overheated to the point of throttling and losing performance! And three phones got hit ''hard'' by this: the LG G4 (with the Snapdragon 808), which became infamous for dying after just one year; the HTC One M9 (with the Snapdragon 810), which became infamous for overheating; and the Sony Xperia Z5, for the same reasons as the M9. No wonder the rest of the competition (HiSilicon and [=MediaTek=]) avoided the ARM A57 design.
** Fast forward to the end of 2017 and the Nexus 5X, made by LG and running the Snapdragon 808, is causing more and more complaints of unrecoverable bootloops. The reason? Two of the CPU's six cores - the powerful ones - spontaneously die, for reasons not well understood. Under certain circumstances the phone can be modified so as to run entirely on the remaining four low-power cores; the performance takes a hit, but at least you're not left with a brick. Until, of course, the low-power cores die too, which has been known to happen. You'd think LG would have learned...
* The iRex Digital Reader 1000 had a truly beautiful full-A4 [=eInk=] display... but was otherwise completely useless as a digital reader. It could take more than a minute to boot up, between 5 and 30 seconds to change between pages of a PDF document, and could damage the memory card inserted into it. Also, if the battery drained all the way to nothing, starting to charge it again would cause such a current draw that it would fail to charge (and cause power faults) on any device other than a direct USB-to-mains connector, which was not supplied with the hardware.
* The Coleco Adam. First of all, its power supply went through the printer, meaning the whole system was dead if the printer stopped working. It didn't even have an OS installed – you had to load it from a cassette. This wouldn't be so bad if the computer didn't generate a powerful magnetic field the moment it was switched on. To top it off, owners were advised to start the computer with the tape still in the drive. Little effort was made to fix this beyond a disclaimer on the package itself.
* The UEFI firmware on some Samsung laptops [[http://www.h-online.com/open/news/item/UEFI-on-Samsung-notebooks-Half-full-is-almost-broken-1827493.html chokes and bricks itself]] if the operating system writes too many variables to the firmware's memory. It was soon discovered that "too many" was about 50%.
* Motorola is one of the most ubiquitous producers of commercial two-way radios, so you'd think they'd have ironed out any issues by now. Nope, there are a bunch.
** The MTX 9000 line (the "brick" radios) were generally made of Nokiamantium, but they had a huge flaw in the battery clips. The battery was held at the bottom by two flimsy plastic claws, and the clips at the top were just slightly thicker than cellophane, meaning that the batteries quickly became impossible to hold in without buying a very tight-fitting holster or wrapping rubber bands around the radio.
** The software to program virtually any Motorola radio, even newer ones, is absolutely ancient. You can only connect via serial port. An actual serial port - USB to serial adapter generally won't work. And the system it's running on has to be basically stone age (Pentium Is from 1993 are generally too fast), meaning that in most radio shops, there's a 486 in the corner just for programming them. Even the new XPR line can't generally be programmed with a computer made after 2005 or so.
*** If you can't find a 486 computer, there's a build of UsefulNotes/DOSBox floating around ham circles with beefed up code to slow down the environment even more than is possible by default. [=MTXs=] were very popular for 900MHz work because, aside from the battery issue, they were tough and cheap to get because of all the public agencies and companies that sold them off in bulk.
* VESA Local Bus. Cards were very long and hard to insert because they needed two ports: the standard ISA and an additional 32 bit bus hardwired to the 486 processor, which caused huge instability and incompatibility problems. Things could get worse if a non-graphic expansion card (usually IO ports) was installed next to a video card, which could result in crashes when games using SVGA graphics accessed the hard drive. The multiple clock frequencies involved imposed high standards on the construction of the cards in order to avoid further issues. All these problems eventually caused the 486-bus-dependent VLB to be replaced by PCI, starting from late-development 486 boards onwards into the Pentium era.
* The Radio Shack TRS-80 (model 1) had its share of hardware defects:
** The timing loop constant in the keyboard debounce routine was too small. This caused the keys to "bounce" -- one keypress would sometimes result in 2 of that character being input.
** The timing loop constant in the tape input routine was wrong. This made the volume setting on the cassette player extremely critical. This problem could somewhat be alleviated by placing an AM radio next to the computer and tuning it to the RFI generated by the tape input circuit, then adjusting the volume control on the tape player for the purest tone from the radio. Radio Shack eventually offered a free hardware modification that conditioned the signal from the tape player to make the volume setting less critical.
** Instead of using an off-the-shelf Character Generator chip in the video circuit, RS had a custom CG chip programmed, with arrow characters instead of 4 of the least-used ASCII characters. But they made a mistake and positioned the lowercase "a" at the top of the character cell instead of at the baseline. Instead of wasting the initial production run of chips and ordering new chips, they eliminated one of the video-memory chips, added some gates to "fold" the lowercase characters into the uppercase characters, and modified the video driver software to accommodate this. Hobbyists with electronics skills were able to add the missing video memory chip, disconnect the added gates and patch the video driver software to properly display lowercase, albeit with "flying a's". The software patch would have to be reloaded every time the computer was booted. Radio Shack eventually came out with an "official" version of this mod which included a correctly programmed CG chip.
** The biggest flaw in the Model 1 was the lack of gold plating on the edge connector for the Expansion Interface. Two-thirds of the RAM in a fully expanded TRS-80 was in the EI, and the bare copper contact fingers on the edge connector oxidized readily, resulting in very unreliable operation. It was often necessary to shut off the computer and clean the contacts several times per day. At least one vendor offered a "gold plug", which was a properly gold-plated edge connector which could be soldered onto the original edge connector, eliminating this problem.
** In addition, the motherboard-to-EI cable was very sensitive to noise and signal degradation, which also tended to cause random crashes and reboots. RS attempted to fix this by using a "buffered cable" to connect the EI to the computer. It helped some, but not enough. They then tried separating the 3 critical memory-timing signals into a separate shielded cable (the "DIN plug" cable), but this still wasn't enough. They eventually redesigned the EI circuit board to use only 1 memory timing signal, but that caused problems for some of the unofficial "speed-up" mods that were becoming popular with hobbyists.
** The Floppy Disk Controller chip used in the Model I EI could only read and write Single Density disks. Soon afterwards a new FDC chip became available which could read and write Double Density (a more efficient encoding method that packs 80% more data in the same space). The new FDC chip was ''almost'' pin-compatible with the old one, but not quite. One of the values written to the header of each data sector on the disk was a 2-bit value called the "Data Address Mark". 2 pins on the single-density FDC chip were used to specify this value. As there were no spare pins available on the DD FDC chip, one of these pins was reassigned as the "density select" pin. Therefore the DD FDC chip could only write the first 2 of the 4 possible DAM values. Guess which value TRS-DOS used? Several companies (starting with Percom, and eventually even Radio Shack themselves) offered "doubler" adapters -- a small circuit board containing sockets for ''both'' FDC chips! To install the doubler, you had to remove the SD FDC chip from the EI, plug it into the empty socket on the doubler PCB, then plug the doubler into the vacated FDC socket in the EI. Logic on the doubler board would select the correct FDC chip.
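The keyboard debounce flaw described above is easy to model: a key's contacts "bounce" (chatter open and closed) for a few milliseconds after a press, and if the settle delay in the polling loop is shorter than the bounce time, one physical keypress registers as several. A hypothetical simulation (all names and timings here are invented for illustration, not taken from the TRS-80 ROM):

```python
BOUNCE_MS = 10  # contacts keep chattering this long after a press

def sample(t_ms, press_at_ms=0):
    """Return 1 if the key reads 'down' at time t_ms.  During the
    bounce window the reading alternates between down and up."""
    dt = t_ms - press_at_ms
    if dt < 0:
        return 0
    if dt < BOUNCE_MS:
        return dt % 2  # chattering: alternates every millisecond
    return 1           # settled: solidly down

def count_keypresses(delay_ms, total_ms=40):
    """Count down-transitions, waiting delay_ms after each one --
    the software equivalent of the ROM's timing-loop constant."""
    presses, prev, t = 0, 0, 0
    while t < total_ms:
        cur = sample(t)
        if cur and not prev:
            presses += 1
            t += delay_ms  # debounce wait after detecting a press
        else:
            t += 1
        prev = cur
    return presses

# Too-small constant: one press reads as several characters.
# A delay longer than the bounce window: exactly one press.
```

With a 1 ms wait the simulated press is counted multiple times; with a 20 ms wait it is counted once, which is all the ROM fix amounted to.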
* The TRS-80 model II (a "business" computer using 8-inch floppy disks) had a built-in video monitor with a truly fatal flaw. The sweep signals used to deflect the electron beam in the CRT were generated from a programmable timer chip. When the computer booted, one of the first things it would do is write the correct timer constants to the CRTC chip. However, an errant program could accidentally write any other values to the CRTC chip, which would throw the sweep frequencies way off. The horizontal sweep circuit was designed to operate properly at just one frequency and will "send up smoke signals" if operated at a frequency significantly different than what it was designed to operate at. If your screen goes blank and you hear a loud high-pitched whine from the computer, shut the power off ''immediately'', as it only takes a few ''seconds'' to destroy some rather expensive components in the monitor.
* Nvidia's early history is interesting… in the same way a train wreck is. There's a reason why their first 3D chipset, the [=NV1=], barely gets a passing note in the official company history page. See, the [=NV1=] was a weird chip which they put on an oddball – even for the times – hybrid card meant to let you play specially ported Sega Saturn games on the PC. The chip's weirdness came from its quadratic primitives, when everybody else used, and has used ever since, triangles. Developing for a quad-supporting chip was complicated, and porting previously existent triangle games to quads wasn't all that better either, so the [=NV1=] was wildly unpopular from the start. Additionally, the hybrid cards integrated other features (such as MIDI playback) that weren't needed and increased cost and complexity. When Microsoft came out with [=Direct3D=] it effectively killed the [=NV1=], as it was all but incompatible with it. Nvidia stubbornly went on to design the [=NV2=], still with quad mapping, intending to put it in the Dreamcast – but then Sega saw the writing on the wall, told Nvidia "thanks but no thanks" and used an NEC [=PowerVR=] instead. Nvidia finally saw the light, dropped quads altogether and came out with the Riva 128, which was a decent hit and propelled them onto the scene – probably with great sighs of relief from the shareholders.
* Anytime your hardware has a firmware update, but the update utility is a Windows executable and the system is running another OS, such as Mac OS or any flavor of Linux. To do the firmware flash the right way, you'll probably have to remove the hardware, hook it up to a Windows machine, and run the update from there. To be fair, it's a reasonable guess that the system will be running Windows, but issues like this can fan the flames of anti-Microsoft sentiment among alternative OS users (like those running Debian-Linux derivatives). In the 1990s, this problem was usually averted by having the firmware utility create a bootable DOS-like system with the flashing tool on the disk.
* Improvements in low-power processor manufacturing by Intel – namely the Bay Trail-T system-on-a-chip architecture – made it possible to build an honest-to-goodness x86 computer running full-blown Windows 8.1, with moderate gaming capabilities, in a box the size of a book. Cue a whole lot of confounded Chinese manufacturers using the same design standards they used on ARM systems-on-a-chip to build Intel ones, sometimes using cases with nary a single air hole and often emphasizing the lack of need for bulky heatsinks and noisy fans. Problem: you ''do'' actually need heat sinking on Intel [=SoCs=], especially if you're going to pump them for all the performance they're capable of (which you will, if you use them for gaming or high-res video playback). Without a finned heatsink and/or a fan moving air around, they'll just throttle down to crawling speed and frustrate their users.
* Back in the early days of 3D Graphics cards, when they were called 3D Accelerators, and even 3Dfx hadn't found their stride, there was the S3 Virge. The card had good 2D performance, but such a weak 3D chip that at least one reviewer called it, with good reason, the world's first 3D Decelerator. That epithet is pretty much ExactlyWhatItSaysOnTheTin, as 3D games performed worse on [=PCs=] with an S3 Virge installed than they did in software mode, i.e. with no 3D acceleration at all.
* The "Home Hub" series of routers provided by UK telecoms giant BT are fairly capable devices for the most part, especially considering that they usually come free to new customers. Unfortunately, they suffer from a serious flaw in that they expect to be able to use Wi-Fi channels 1, 5 or 11, which are naturally very crowded considering the ubiquity of home Wi-Fi, and of BT's routers in particular. And when those channels are congested, the routers will endlessly rescan in an effort to get better speeds, knocking out your internet connection for 10-30 seconds every 20 minutes or so. Sure, you can manually force the router onto another, uncongested channel... except that it'll keep rescanning based on how congested channels 1, 5 and 11 are, even if there are ''no devices whatsoever'' on the channel that you set manually. Even BT's own advice is to use Ethernet (and a powerline adapter if needed) for anything that actually needs a rock-solid connection.
[[/folder]]

[[folder:Instant Messengers That Need To Get The Message]]
* The new Yahoo! Instant Messenger was a disaster. While tech sites all agreed that the older version was definitely showing its age in 2016, whatever bile they aimed at it could not ''hope'' to compare with the instant loathing that people had for the new app--which, amongst other woes, ''didn't tell you if you'd been messaged or who was online!'' The comment sections on articles announcing the release of the new messenger were filled with vows from people to never use Instant Messenger again, as two features that no chat program could seemingly do without were done without. Oh, and gone were a host of emoticons people had grown to love.
[[/folder]]

[[folder:Mass-Storage Mishaps]]
Watch out: these storage technologies may be harmful to your data, a pain to use, or a drain on your money due to lack of adoption.
* The Commodore 64, one of the most popular computers of all time, wasn't without its share of problems. Perhaps the most widely known is its ''extreme'' slowness at loading programs. This couldn't really be helped with a storage medium like tape, which remained slow even after various clever solutions to speed it up, but floppy disks really ought to have been faster. What happened was that Commodore had devised a hardware-accelerated system for transferring data that worked fairly well, but then also found a hardware bug in the input/output chip that made it work not at all. Replacing the buggy chips was economically unfeasible, so the whole thing was revised to [[https://en.wikipedia.org/wiki/Bit_banging work entirely in software]]. This slowed down drive access immensely and caused the birth of a cottage industry for speeder carts, replacement ROM chips and fastloaders, most of which sped things up at least fivefold. Additionally the drive itself had a CPU and some RAM to spare -- effectively a secondary computer dedicated to the sole task of feeding data to the primary computer (hence its phenomenal cost) -- so it was programmable, and people came up with their own ways to improve things further. Eventually non-standard formats were developed that loaded ''25 times as fast as normal''.
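"Work entirely in software" here means bit banging: with the hardware shifter unusable, the C64 and the drive's own CPU had to toggle the serial bus lines themselves, one bit at a time, burning cycles throughout. A loose sketch of the idea (this is illustrative Python, not actual 1541 routines; the function and callback names are invented):

```python
def bitbang_send(byte, set_clock, set_data):
    """Shift one byte out LSB-first over a software-driven serial
    line: the CPU itself drives the data line, then pulses the
    clock line once per bit."""
    for i in range(8):
        bit = (byte >> i) & 1
        set_data(bit)   # put the bit on the data line
        set_clock(1)    # raise clock: receiver samples the bit now
        set_clock(0)    # drop clock: ready for the next bit

# A trivial software "receiver" that records what it saw:
received = []
current_bit = [0]

def set_data(bit):
    current_bit[0] = bit

def set_clock(level):
    if level:  # sample on the rising edge
        received.append(current_bit[0])

bitbang_send(0xC6, set_clock, set_data)
value = sum(bit << i for i, bit in enumerate(received))
assert value == 0xC6  # the byte survives the trip, bit by bit
```

Every one of those per-bit steps is CPU work, which is why software transfer was so much slower than the planned hardware path, and why fastloaders that tightened this loop (or sent two bits per handshake) yielded such dramatic speedups.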
* Why, after the introduction of integrated controllers into every other storage device, does the floppy have to be controlled by the motherboard? Sure, it makes the floppy drive simpler to manufacture, but you're left with a motherboard that only knows how to operate a spinning mass of magnetic material. Try making a floppy "emulator" that actually uses flash storage, and you'll run into this nigh-impassable obstacle.
** The floppy drive interface design made sense when it was designed (the first PC hard drives also used a similar interface) and was later kept for backward compatibility. However, a lot of motherboards also support IDE floppy drives (there may not have been any actual IDE floppy drives, but an [=LS120=] drive identifies itself as a floppy drive and can read regular 3.5" floppy disks), and a SCSI or USB device can also identify itself as a floppy drive. On the other hand, the floppy interface is quite simple if you want to make your own floppy drive emulator.
* Sony's [=HiFD=] "floptical" drive system. The Zip Drive and the LS-120 Superdrive had already attempted to displace the aging [=1.44MB=] floppy, but many predicted that the [=HiFD=] would be the real deal. At least until it turned out that Sony had utterly screwed up the [=HiFD's=] write head design, which caused performance degradation, hard crashes, data corruption, and all sorts of other nasty problems. They took the drive off the market, then bought it back a year later... in a new 200MB version that was totally incompatible with disks used by the original 150MB version (and [=720KB=] floppies as well), since the original [=HiFD=] design was so badly messed up that they couldn't maintain compatibility and make the succeeding version actually work. Sony has made a lot of weird, proprietary formats that have failed to take off for whatever reason, but the [=HiFD=] has to go down as the worst of the lot.
* The IBM Deskstar 75GXP, nicknamed the [[FanNickname Death Star]]. While it was a large drive by year-2000 standards, it had a disturbing habit of suddenly failing and taking your data with it. The magnetic coating was of subpar reliability and came loose easily, causing head crashes that could strip the magnetic layer clean off. One user with a RAID server setup reported the failures to their RAID controller manufacturer; supposedly, this user was replacing their IBM Deskstars [[http://s3.computerhistory.org/groups/ds-ibm-75gxp-family-20121031.pdf at a rate of 600-800 drives per day]]. There have been many hard drives criticized for various reasons, but the "Death Star" was something truly spectacular for all the wrong reasons.\\
There is anecdotal evidence that IBM knowingly sold faulty products and engaged in deception, spewing out rhetoric about the industry-standard failure rates of hard drives. This denial strategy started a chain reaction that led to a collapse in customer confidence. Class-action lawsuits helped convince IBM to sell their hard drive division to Hitachi in 2002. [[http://www.pcworld.co.nz/article/157888/25_worst_tech_products_all_time/ (See "Top 25 Worst Tech Products of All Time" for this and more.)]]
* The Iomega Zip disk was undeniably a big success, but user confidence in the drives' reliability was shattered by the "Click of Death". Though tens of millions of the drives were sold, thousands of them would suffer misalignment and damage whatever medium was inserted into the drive. This wouldn't necessarily have been horrible by itself, but Iomega made a big mistake: downplaying the users who complained about drive failures and being dismissive about their lost data.\\
The Zip's worst problem wasn't even the fact that it could fail and potentially ruin a disk, but that such a ruined disk would go on to ruin whatever drive it was then inserted into. Which would then ruin more disks, which would ruin more drives, et cetera. Effectively a sort of hardware virus, it turned one of the best selling points of the platform (inter-drive compatibility) into its worst point of failure.\\
After a class-action lawsuit in 1998, Iomega issued rebates in 2001 for future products. It was too little, too late; CD-R discs were by then more popular for mass storage and perceived as more reliable. [[http://www.pcworld.co.nz/article/157888/25_worst_tech_products_all_time/ The New Zealand site for PC World has the original article still available.]]
** Iomega's previous magnetic storage product, the Bernoulli Box, was designed to avert exactly this disaster by exploiting the Bernoulli effect, which makes it [[SugarWiki/GeniusProgramming physically impossible for the drive head to make contact with the medium.]] When Iomega designed the Zip disk specification, the Bernoulli effect was left out, likely to save costs. Yes, Iomega already had a disk format designed to prevent the very failures the Zip drives suffered.
* Maxtor, now defunct, once sold a line of external hard drives under the [=OneTouch=] label. However, the USB 2.0 interface would often malfunction and corrupt the filesystem on the drive, rendering the data hard to recover. You were better off removing the drive from its enclosure and installing it on a spare SATA connection on a motherboard. Not surprisingly, Maxtor was already in financial trouble before Seagate acquired them.
* The 3M Superdisk and its proprietary 120MB "floptical" media were intended as direct competition to the Iomega Zip, but in order to penetrate a market that Iomega owned pretty tightly the Superdisk needed a special feature to push it ahead. That feature was the possibility to write up to 32 megabytes on a bog-standard 1.44MB floppy, using lasers for alignment of the heads. Back then 32MB was significant storage, and people really liked the idea of recycling existing floppy stock -- of which everybody had large amounts -- into high-capacity media. The feature might just have given the Superdisk the edge it needed; unfortunately what wasn't immediately clear, nor explicitly stated, was that the drive was only able to do random writes on its specific 120MB disks. It could indeed write 32MB on floppies, but only if you rewrote '''all''' the data every time a change, no matter how small, was made – basically like a CD-RW disk with no packet-writing system. This took a relatively long time, and transformed the feature into a gimmick. Disappointment ensued, and the format didn't even dent Iomega's empire before disappearing.
* The Caleb UHD-144 was an attempt to gain a foothold in the floppy-replacement market. Unfortunately, it was ill-timed; the company hadn't taken the hint from the failures of Sony and 3M, so the product never really got a chance to be seen in action. The Zip-250 and inexpensive CD-R media rendered the technology dead on arrival; a tragic example of a "good" idea rushed to market without checking what the competition had to offer. (The Zip-250 itself was quickly marginalized by cost-effective CD-R discs, which could be read in the optical drives already present in numerous computers.)
* Some DVD video players, especially off-brand models, seem to occasionally decide that the disc you have inserted is not valid. The user ejects the disc and reinserts it, and hopefully the DVD player decides to cooperate. This can be a headache if the machine is finicky about disc defects due to copy protection, or can't deal with the brand of DVD -/+ recordable disc that you use for your custom films. Bonus points if you have to crack a copy-protected disc to burn it onto a blank DVD because [[DigitalPiracyIsEvil you can't watch the master copy.]] The inverse situation is also possible, where you have a DVD player made by a "reputable" brand, and even that won't let you watch the locked-down DVD you just spent money on.
** Some DVD players are overly cautious about the discs they're willing to play because of regional lockout. Live in Australia and have a legally purchased Region 4 DVD? Turns out it was an NTSC disc, and your DVD player is only willing to play PAL discs. Oops.
* After Solid-State Drives started taking over from mechanical hard drives as the storage device of choice for high-end users, it quickly became obvious that the transfer speeds would soon be bottlenecked by the speed of the Serial ATA standard, and that PCI Express was the obvious solution. Using it in the form of full-sized cards wasn't exactly optimal, though, and the smaller M.2 form factor is thermally limited and can be fiddly to install cards in. The chipset industry's answer was SATA Express, a clunky solution which required manufacturers to synchronise data transfers over two lanes of PCI Express and two SATA ports, standards with ''completely'' different ways of working. Just to make it even worse, the cable was an ugly mess consisting of four separate wires (two SATA, one PCI-E, and a SATA power connector that hung off the end of the cable). The end result was one of the most resounding failures of an industry standard in computing history, as a grand total of ''zero'' storage products made use of it (albeit a couple of manufacturers jury-rigged it into a way of connecting front-panel [=USB3.1=] ports), with SSD manufacturers instead flocking to the SFF-8639 (later renamed U.2) connector, essentially just four PCI-E lanes crammed into a simple cable.
* While there are many things that set back the NES Classic Edition (stock issues, inconvenient design choices, tons of alternatives), one glaring flaw is its internal storage. Many fans have calculated that the system could easily hold the NES' entire library ten times over, box art and metadata included, and yet it only uses a small fraction of that space. While there are many people who would have put the empty space to good use (it's very easy to hack the thing by simply flashing the storage and putting whatever you want on it), one has to wonder why Nintendo gave it so much storage only to do next to nothing with it, especially when the games that are already on it are [[SoOkayItsAverage unimpressive as it is]].
[[/folder]]

[[folder:Miscellaneous Madness]]
Needless to say, there are more bad coders out there than good ones.
* You can't talk about idiot programming without mentioning one of the most infamous software glitches of all time. Most of the glitches on this page probably caused inconveniences, lost data, or locked someone out of using a product. But what about a software glitch that actually ''killed people''? That would involve none other than the infamous [[https://en.wikipedia.org/wiki/Therac-25 Therac-25]], a piece of radiology equipment so poorly programmed that it gave everyone who used it massive (and often quite fatal) overdoses of radiation. The Other Wiki's article on it describes it as "involved in at least six accidents between 1985 and 1987, in which patients were given massive overdoses of radiation. Because of concurrent programming errors, it sometimes gave its patients radiation doses that were hundreds of times greater than normal, resulting in death or serious injury. These accidents highlighted the dangers of software control of safety-critical systems, and they have become a standard case study in health informatics and software engineering. Additionally the overconfidence of the engineers and lack of proper due diligence to resolve reported software bugs, is highlighted as an extreme case where the engineer's overconfidence in their initial work and failure to believe the end users' claims caused drastic repercussions." This machine's shoddy programming was so infamous that it is now cited in introductory programming classes to this day as an example of how '''not''' to program.
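The "concurrent programming errors" at the heart of the Therac-25 were race conditions: safety-critical state was updated in several steps with nothing stopping the beam from firing in between, so a fast operator edit could outrun the check. An illustrative sketch in that spirit (NOT the actual Therac-25 code; the class, method names, and two-step edit are invented to show the shape of the bug):

```python
class TherapyMachine:
    """Toy model of a two-field safety invariant with no locking."""

    def __init__(self):
        self.mode = "xray"       # x-ray mode requires the attenuator
        self.attenuator_in = True

    def begin_mode_edit(self, new_mode):
        # Step 1 of an edit: the software flag flips immediately...
        self.mode = new_mode

    def finish_mode_edit(self):
        # Step 2: ...but the physical attenuator takes time to move,
        # so this runs noticeably later.
        self.attenuator_in = (self.mode == "xray")

    def fire(self):
        # Unsynchronized safety check: trusts that both fields agree.
        if self.mode == "electron" and not self.attenuator_in:
            return "low-power electron beam"
        if self.mode == "xray" and self.attenuator_in:
            return "attenuated x-ray beam"
        return "OVERDOSE"  # inconsistent state slipped past the check

m = TherapyMachine()
m.begin_mode_edit("electron")   # operator retypes the mode...
result = m.fire()               # ...and fires inside the unsafe window
m.finish_mode_edit()            # hardware catches up too late
```

Fired between the two steps, `fire()` sees `mode == "electron"` with the attenuator flag still set and returns `"OVERDOSE"`; fired after `finish_mode_edit()`, the fields agree and the result is safe. The real fix is to make the check and the edit mutually exclusive (a lock, or refusing to fire until the edit completes), which is exactly the discipline the Therac-25 lacked.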
* Actual code that causes these observed effects is a weekly feature at [[http://www.thedailywtf.com thedailywtf.com]].
** Also from the programming side: libraries developed by the DepartmentOfRedundancyDepartment, where you have to lapse into PokemonSpeak to write any meaningful code. For instance, take this line from Debian's version of [[http://awesome.naquadah.org/ awesome's]] rc.lua:
---> @@[={ "Debian", debian.menu.Debian_menu.Debian }=]@@
*** To be clear, "menu" is the ''only'' member of "debian", and "Debian_menu" ''is the only member of "debian.menu"''.
** Java ''should'' have filled the niche of web-based games that Flash mostly owns... except that early versions of Java were so slow and so unnatural looking that Flash actually looked good in comparison. By the time they fixed it, Flash had become the de-facto standard for this kind of thing, much to the chagrin of just about everyone except Adobe.
* The [[http://rinkworks.com/stupid/ Computer Stupidities section of Rinkworks.com]] has its own [[http://rinkworks.com/stupid/cs_programming.shtml section for programming]].
* Adding another Sony example to the lot already present on this page: when they first started out with portable music players, Sony didn't support the MP3 standard, due to their historical unwillingness to support anything that could encourage piracy of any kind. Their players instead supported [=Atrac3=], a proprietary Sony audio format. Being (somewhat surprisingly) smart enough to figure out that users would want to listen to their [=MP3=] music, Sony shipped the players with [=SonicStage=], an upload program capable of converting MP3 files to Atrac. [=SonicStage=] promptly proceeded to annoy a whole lot of people: it was buggy and prone to crashing, and some computers couldn't run it at all (for reasons unknown), prompting many to return the players and switch brands.
** The effective demise of [=SonicStage=] and the unpopularity of early Sony players ended up killing off the [=Atrac3=] format, which -- despite its ''very'' proprietary design -- was liked by some, whatever the failings of the players that supported it, for its somewhat better quality than MP3 at the same bitrate.
** Creative did the same with their first hard-disk players: before they started supporting the [=MTP=] format (widely supported by many music managers), the only way you could upload stuff to them was by using the godawful [=PlayCenter=] program, later superseded by the even worse [=MediaSource=]. Many users preferred to keep [=PlayCenter=]: buggy as it was, at least it did its job ''sometimes''. Both programs also attempted to set themselves as default players and music managers, further irritating users.
** Ditto Microsoft with the Zune, at least initially. Read the Microsoft section for more.
* Many streaming video websites have a ridiculously small buffer size; quite often only 3 to 4 seconds of video, and possibly even less at HD resolutions. In optimum conditions, it's not so problematic, but if the site is particularly busy or your Internet connection isn't too fast, it can make the videos all but unwatchable. Part of this problem is actually the politics of internet service providers. Content providers ([=YouTube=] and Netflix, for example) and service providers are in heated debates about who should fork over the dough for the upkeep of the network. When these debates fail to make any progress, certain aspects of how videos get to your home suffer. It's not that content providers suck, it's that they have to jump through more hoops to get to you. You can read more about it in [[http://arstechnica.com/information-technology/2013/07/why-youtube-buffers-the-secret-deals-that-make-and-break-online-video/ this article]].
** An additional problem caused by the introduction of ad breaks on certain longer videos, both for the commercial networks and independent web producers, is an annoying tendency for the players to just stop working after an ad break and never go back to the video originally being watched. Fortunately most sites are kind enough to remember where you were in the video and just pick up after the last commercial break, but several -- which, annoyingly, includes Blip.tv -- will force you to restart from the beginning. Which means sitting through some (if not all) of the ads ''again''.
* Computers don't truly have random number generation. Instead, they take a number (often something like the system time) as a "seed" and use that to generate a stream of random-looking numbers (it's called a "pseudo-random number generator", or PRNG), and a given seed will ''always'' produce the same sequence each time it's used. This isn't Idiot Programming itself -- the way computers work means that they ''can't'' actually generate proper random numbers without an outside source of randomness -- but considering that this fact is mentioned in every basic computer science class worth its salt, any code that fails to take it into consideration ''is'' Idiot Programming.
** For a specific example, [[http://www.cigital.com/papers/download/developer_gambling.php at least one poker site was cracked this way]]. Someone wrote a program that, using the programmer's knowledge of the RNG and the player's own pocket cards, could tell the player everyone else's cards as well as the future cards that would come off the deck, [[GameBreaker completely breaking the game]].
** Another example was found in a keno machine which would always roll the same sequence of numbers after each reboot, since it used the same seed every time it booted up. This allowed [[https://en.wikipedia.org/wiki/Montreal_Casino#Keno_scandal one clever guy to win big three games in a row]]. Slot and keno machines nowadays prevent this by constantly advancing the seed even while it's not being played, by either incrementing the seed number or just rolling and throwing away the outcomes[[note]]and even this must be done very carefully, as some of the more popular ways of implementing it can actually ''make the problem worse'' due to "RNG jitter"[[/note]].
** RANDU, the infamous random number generator shipped on IBM computers in the '60s. How bad is it? Aside from the fact that every number it generates is odd (which is very, very bad on its own, yet easy to work around), any three consecutively generated numbers satisfy a simple linear relationship, so when plotted in three dimensions the points all fall onto [[https://en.wikipedia.org/wiki/File:Randu.png just 15 planes]].
*** And because the computer it shipped with, the System/360 [[UsefulNotes/MainframesAndMinicomputers mainframe]], is widely regarded as IBM's greatest work and was ''the'' computer of TheSixties and TheSeventies, the generator became so widespread that traces of it still surface periodically even '''now''', almost fifty years later.
** Creator/{{Sega}}'s 1988 arcade version of ''VideoGame/{{Tetris}}'' uses the exact same RNG seed every time it boots up, resulting in the infamous [[http://tetrisconcept.net/threads/wheres-the-sega-tetris-poweron-pattern.680/#post-23443 "power-on pattern"]]. This goes against a major concept of ''Tetris'' (randomized pieces) and allows the player to simply [[GameplayDerailment use the pattern to plot out their piece placements to]] [[GameBreaker max out the score counter in as few pieces as possible]], rather than playing against an RNG seed they don't know and dealing with whatever it throws out. Of course, you need to have machine-resetting privileges or be playing on an emulator to take advantage of this.
*** To be fair, the original arcade version stored the seed in NVRAM, so resetting the machine wouldn't be enough to get the pattern back; you would need to enter setup and clear the NVRAM, which arcade players could not do. But, like nearly every other Sega arcade game back then, it got bootlegged a lot, and the bootleggers left off the NVRAM, meaning players COULD reset the pattern after all. Additionally, it generated a looping 1000-piece sequence rather than simply advancing the seed with every piece (which would have been easier), allowing for playing forever once you figured out how to handle the wraparound. An alternate arcade version of Sega Tetris for different hardware DID generate pieces as needed instead of a looping sequence, but that hardware didn't have NVRAM, so it ALSO had a power-on pattern.
** Generally speaking, using the RNG to advance the RNG makes it ''less'' random, not more, as multiple positions will end up having the same result. This mistake is common enough to have its own name -- "RNG jitter". One of the programs guilty of it is ''VideoGame/NetHack'', as constantly mentioned in [[http://tasvideos.org/5085S.html this]] AprilFoolsDay [[SpeedRun TAS]] submission.
** Spectrum computers will always generate the same random number, .0011291504, the first time a random number is requested after booting up. This can be avoided by calling RANDOMIZE to reseed the generator before requesting a random number, but given the low quality control of Spectrum software, many, many programs didn't bother. It is such a cliché to see .0011291504 show up in bad Speccy games that there are {{Deconstruction Game}}s referencing it, such as ''[[http://www.sqij.co.uk/csscgc2015/blog/2015/03/11/randomize-randomize-randomize-randomize-randomize-randomize-randomize-randomize-randomize-randomize/ RANDOMIZE RANDOMIZE RANDOMIZE RANDOMIZE RANDOMIZE RANDOMIZE RANDOMIZE RANDOMIZE RANDOMIZE RANDOMIZE]]''.
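The determinism described above is easy to demonstrate in a few lines. A minimal Python sketch (the seed value and the keno-style 1-to-80 range are purely illustrative):

```python
import os
import random

# A pseudo-random generator fed the same seed always produces the same
# sequence -- the root cause of the keno and Sega Tetris exploits above.
def keno_draw(rng, n=5):
    return [rng.randint(1, 80) for _ in range(n)]

boot_one = random.Random(1988)   # the same fixed "power-on" seed...
boot_two = random.Random(1988)
assert keno_draw(boot_one) == keno_draw(boot_two)   # ...the same "power-on pattern"

# Seeding from something that differs between runs breaks the pattern;
# modern gambling machines also keep the generator ticking while idle.
fresh = random.Random(os.urandom(16))   # unpredictable seed from the OS
```

The fix is not to avoid pseudo-randomness (you can't) but to make sure the seed itself is unpredictable, and to keep advancing the state carefully rather than reseeding from the generator's own output.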
* For graphing calculators, several different kinds of hardware often exist for linking them to computers, and different linking programs, from different authors, don't all support the same hardware. That much is forgivable. But unlike with Texas Instruments calculators, linking applications for Casio calculators ''all used different, incompatible file formats on the PC''.
* While Windows Vista (see below) did introduce a ton of problems, it also did something that revealed many a programmer's idiot programming choice: assuming that the user account always had administrative rights. In Windows Vista, Microsoft introduced UAC, which would only assign standard user rights to a program, even if the user was an administrator. This is sensible, as it limits the damage that the program can do if it goes rogue. Programs that needed administrator rights were detected based on the file name and an optional configuration file called a manifest. Of course, older software that needed administrator rights knew nothing of manifests, and would fail in unpredictable ways, usually spouting an error message that wouldn't make the actual problem obvious to the non-technical (or necessarily even to the technical) -- although Windows did sometimes spout a dialogue box along the lines of "Whoops, this program looks like it needs admin rights, but it didn't ask for them and I didn't realize until just now, do you want me to make sure it runs as an admin in future?".
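For reference, the manifest mentioned above is just a small XML file embedded in (or shipped alongside) the executable. A minimal one declaring the need for admin rights looks roughly like this:

```xml
<!-- app.manifest: tells Windows up front what rights the program needs,
     so UAC can prompt for elevation instead of letting it fail mysteriously -->
<assembly xmlns="urn:schemas-microsoft-com:asm.v1" manifestVersion="1.0">
  <trustInfo xmlns="urn:schemas-microsoft-com:asm.v3">
    <security>
      <requestedPrivileges>
        <requestedExecutionLevel level="requireAdministrator" uiAccess="false"/>
      </requestedPrivileges>
    </security>
  </trustInfo>
</assembly>
```

Pre-Vista software, of course, shipped with no such declaration, which is exactly why Windows had to fall back on filename-based guesswork.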
* ''[=BitTorrent=]'' clients have, since 2004, had the issue of "stopping the Internet", usually by overloading the router's DNS cache or connection table. It's convenient to have another device, another connection, or access to another computer just to look up the solution to this problem, which involves attempting steps like resetting the router from the router itself, accessing the router's software interface and resetting it from there, flushing the DNS cache, and adjusting settings in the OS, the firewall, and the [=BitTorrent=] client. Wherever the problem lies -- router hardware, router firmware, or the client itself -- you'd think it would have been fixed by now so the end user would not have to deal with it.
* So the designers of the Soviet ''Phobos'' space probe left testing routines in the flight computer's ROM -- fair enough, everyone does, because removing them means retesting and recertifying the whole computer, which generally would've been plainly ''impossible'' without said routines. But to design the probe's UsefulNotes/OperatingSystem in such a way that a one-character typo in an incoming command would ''accidentally'' trigger a routine [[FateWorseThanDeath that turns off the attitude thrusters]], making the spacecraft unable to point its solar panels at the Sun and recharge its batteries, effectively killing it, takes a special kind of failure.
* Firefox:
** Firefox 4's temporary file deletion algorithm was unusual, in that it was actually pretty effective at deleting older files and freeing up large amounts of disk space. It suffered a major problem, though: it chewed up huge amounts of CPU power and maxed out the hard drive in the process, which could slow your entire system to a crawl. Worse still, there was no way of aborting the cleanup routine, and if you killed the Firefox process, it would just invoke the file cleaner again as soon as you restarted the browser. It wasn't until Firefox 5 that the file cleaner got fixed; it now used much less CPU power and, while still fairly disk-intensive, nowhere near to the same extent as before.
** A few years back, Firefox had an update that changed the system for syncing your passwords. This allowed recovery of accounts in case of losing your master decoder key, but had an outrageous problem: you could no longer sync passwords with a master password set on your Firefox profile, requiring the user to disable it and then re-tick passwords in the sync list. Thankfully, Mozilla fixed the problem, and all is well again.
** Syncing Firefox accounts makes it very convenient to keep your computers up to date with each other, but syncing for the first time causes the browser to eat CPU cycles -- for a while -- like narcotics. All this just to download data from a server and import it into the browser?
* Tech sites have noted a rather disturbing trend in how certain handheld devices handle firmware updates. The sane way to do such an update over the internet is to check for the existence of updated firmware, download it, validate its checksums, erase the old firmware, and then load the updated version. Ideally there's also a backup firmware chip, or some other way of restoring the device if things go pear-shaped. Unfortunately, a lot of devices (especially cheaper ones) don't actually do that -- instead, they check for a firmware update, and upon getting confirmation that there is such an update, the device ''immediately'' wipes the old firmware, ''then'' downloads and installs the updated version. If anything goes significantly wrong during the download (e.g. loss of internet connection, loss of power, or a software error), then the device will almost certainly be bricked. On top of that, most of the time there's no way to restore such a device to working order outside of replacing the motherboard, and only a 50/50 or so chance the manufacturer will replace it under warranty.
** Curiously, most so-called "smart [=TVs=]" will do this -- Samsung and Sony seem to be particularly bad about it.
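The safe ordering described above -- validate first, erase last -- can be sketched in a few lines of Python. The function names, the `flash` interface, and the SHA-256 choice here are illustrative, not any vendor's actual API:

```python
import hashlib

def safe_update(new_image, expected_sha256, flash):
    """Validate a downloaded firmware image BEFORE touching the old one.

    `flash` stands in for a hypothetical device-flashing interface with
    .erase() and .write() methods; it is only touched once the image
    checks out, so a corrupted or truncated download changes nothing.
    """
    digest = hashlib.sha256(new_image).hexdigest()
    if digest != expected_sha256:
        return False          # bad download: old firmware stays intact
    flash.erase()             # the point of no return comes AFTER validation
    flash.write(new_image)
    return True

# The "idiot" ordering erases first, so any failed download leaves a brick:
#   flash.erase(); image = download(); flash.write(image)   # DON'T
```

A failed checksum here costs the user one retry; in the erase-first ordering it costs them the device.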
* A flight of ultra-high-tech F-22 Raptors suffered multiple computer failures and was practically crippled because the aircraft's programming couldn't cope with the time zone change of crossing the International Date Line. Somehow, it never occurred to the designers that a fighter ''aircraft'' just might cross the International Date Line, so they never programmed its systems to adjust for it -- a standard part of the programming of modern ''cellphones''. This oversight resulted in all Raptors being temporarily grounded.
* This is done ''InUniverse'' in ''Fanfic/CalvinAndHobbesTheSeries''. Calvin's time machine is set to power down when it's almost dead.
** And it has to ''load'' when it refuels.
* Many, many pieces of PC gaming hardware feature extremely fancy graphical interfaces and special effects for their drivers. Although these might make the hardware look attractive for the few minutes people spend setting it up, they also consume extra system resources, and thus interfere with the actual games the user might want to play.
** The Razer Starcraft II series of hardware was perhaps the worst example of this. The Razer Spectre mouse, made for Starcraft II, had such an intrusive driver that a patch had to be issued for ''VideoGame/StarCraftII'' itself to prevent the slowdown it caused. The Razer Marauder keyboard, also made for Starcraft II, used the same intrusive driver -- and when Razer sponsored their own Starcraft II team, the keyboards they issued them with were... Razer Blackwidows.
* At some point, the client software for AOL was so intrusive that a page on horrible software (now part of the Administrivia/PermanentRedLinkClub) said that you shouldn't download it, and that the only way to get rid of it was to use a Live CD with support for NTFS partitions.
* UEFI has become commonplace in modern computers, and it offers many benefits over the old BIOS: support for better interfaces, so configuration is easier, and none of the old limitations from the [=80s=], like being limited to 640KB of RAM. Hell, some computers don't even have the old POST screen anymore. Thanks to the "compatibility support module" found in most UEFI firmware, it can still be used with older operating systems by emulating a BIOS. But of course, UEFI mode causes issues once in a while due to shoddy coding -- e.g., [[http://www.h-online.com/open/news/item/Samsung-UEFI-bug-Notebook-bricked-from-Windows-1801439.html filling the UEFI memory with too many variables]] (which could be triggered by ''simply running Linux'') could brick your laptop.
** One issue with UEFI being generally prettified in comparison to BIOS is that some manufacturers remove the old POST screen without replacing it with anything but a non-removable manufacturer logo. The average user is unlikely to mind, but it's a massive pain to any tech who used to rely on the POST messages to figure out the hardware, and potential early errors, without having to load the OS and from there a hardware-scanning application.
** Software commands destroying hardware was a problem we'd managed to remove from the world of computing as a whole back in the eighties -- count on the UEFI trainwreck to bring it back. Namely, the Linux command "rm -rf /" has now acquired a whole new level of danger: with BIOS, all this did was wipe your system (and it was occasionally run for kicks and giggles by Linux admins before reinstalling a system, because it can be fun to see the system progressively lose its brains), but with UEFI the motherboard's firmware variables are ''mounted read-write in the filesystem'', and many motherboards ''give write privileges to the firmware by default'' -- so running that command will also '''brick your motherboard'''. Slow clap.
*** But no one keeps that mounted, right? That would be almost as stupid as keeping /boot mounted.
* vBulletin 5 has this in spades. Thousands of bugs? Check. Bugs that let members figure out what private sections exist, or that stop them searching, viewing more pages, or posting if [=JavaScript=] is disabled? Check. Changing every aspect of the URL structure on upgrade in one fell swoop and completely mauling sites' search engine rankings? Check again. There's even [[http://vbtruth.com/vbulletin-5-code-review-by-rafio/460/ a code review]] that breaks down the shoddy code line by line, filled with examples of Idiot Programming and inconsistency.
** vBulletin 4 wasn't that great either; the old developers flew the coop after the company somehow thought selling out to Internet Brands was a good idea. All hell broke loose.
* Zamfoo. This web hosting control panel software, as [[http://www.webhostingtalk.com/showthread.php?s=62ec0ef8a7e5a2050a51b7f0d21f62bc&t=1275572 discussed at Webhosting Talk]] and [[http://www.reddit.com/r/programming/comments/1gfve8/how_not_to_handle_a_critical_security/ Reddit]] is an absolute goldmine of examples of how NOT to program anything in existence. Some examples:
** The 'easy upgrade' script has you enter your server root login information on the company's website, which is then [[HighlyVisiblePassword transmitted as plain text through HTTP]].
** The code (which is supposedly encrypted) is devoid of any logic or decent coding sense, doing things like enabling and then disabling "strict mode" to "fix" errors and using only the most basic programming constructs possible.
** The released updates were literally hacked in about five minutes flat, and didn't fix any of the issues properly.
** Add "support" which amounts to threats and personal attacks, the coder's use of nulled (stolen) software, and the chance that the whole thing could well contain unlicensed code from other software, and you have a disaster in every way possible.
* Want to register on Finale [=PrintMusic=]'s forums (or certain others, such as the Magic Set Editor ones)? We hope you don't plan on being away from your account for too long, because it will deactivate after a few months and you can't log back in. You can't simply create a new account, either, because the old information will still be in the system and it won't allow you to use the same information for another account. You can contact a forum administrator and have them reactivate the old one, but you need to view the admin's profile to gain their contact information... and you can only view user profiles if you're logged into an account. The Catch-22 should be obvious.
** As for Finale's programs themselves: in ''Finale Notepad 2003'', there's no limit to how high or low you can place a note on the music sheet, but be warned: placing a note very high or very low can sometimes freeze ''the whole system'', with the only solution being to force-shut-down your computer with the power button.
* Oracle's [=VirtualBox=] (like many of Oracle's products) has gotten a lot of flak for its extremely poor programming. Version 4.3.14 is especially bad: despite the release notes saying it doesn't need to restart a computer running Windows after installing, it still does. After restarting, as with certain previous versions, guest operating systems won't boot. Unlike previous versions, before saying something vague about why the virtual machine won't start, it gives the user another error message, written in hex or something. After attempting to repair the installation (which actually worked in previous versions) and restarting again, this time [=VirtualBox=] itself won't start, instead greeting the user with the same indecipherable error message.
** [=VirtualBox=] also conflicts with Microsoft Hyper-V. This isn't idiot programming in itself, since the nature of virtualization means that you can only use one such program at once. What's idiot programming is how it deals with the conflict: namely, it blue-screens the entire computer. Even more ironically, if you turn off Hyper-V and then run [=VirtualBox=], it'll still blue-screen... and then Windows will ''reinstall Hyper-V'', because since the system blue-screened, Windows's recovery features will undo the change to the operating system that they believe caused the fault.
* In general, any and all interfaces made for specific gadgets and intended to look non-threatening to the end user. This is less of a problem today than it used to be, thanks to universal interfaces such as USB, MTP and various other platform-agnostic protocols that let you just plug in a device and see its contents, upload to it, or just have it work with little-to-no attention required. Back when this stuff wasn't so well-baked, though, it was common to receive a nice software package with, say, your camera, and you needed that to get the pictures -- which might well be in some proprietary format, or even encrypted -- out of the device's memory and into your computer. The software was almost always Windows-only, natch, and typically written by someone who would have lost a programming competition against a trained monkey. So you'd get horrors like [[http://www.dansdata.com/images/quickshot001/interface.jpg this]], which might -- if you were particularly lucky -- actually let you ignore all the family-friendly features (filters! colors! cropping tools! Instagram before Instagram existed!) and just dump the damn data on the hard drive without crashing. But it wasn't unusual to have to relaunch the application several times before it would just do its job.
* Speaking of MTP: it was born as a substitute for the mass storage protocol, for devices that would benefit from some form of data manipulation as it was being sent through -- or to make DRM easier, depending on whom you listen to. It was slow to be adopted, because mass storage is just so much simpler and easier, but then the Android project picked it up to solve its age-old problem with separate filesystems. [[note]]Being based on Linux, the main system is on an ext3 or ext4 partition, but Windows can't read that, so if you want your documents visible when you plug your phone into anything other than a Linux PC, you have to put the user data on a FAT32 partition. And you can't use a FAT32 partition for everything, because it doesn't allow the permissions that Linux needs in its root filesystem. Hence split partitions, with all the headaches they gave to users who couldn't understand why they still had 12 gigabytes free for their data but had run out of space for apps.[[/note]] With MTP, all the data goes through the device's ''and'' the computer's processor (plus various other technical drawbacks); this makes it filesystem-independent and means a Windows computer can access the whole of an Android device's memory, but it has a ''heavy'' cost in terms of speed -- a large backup operation can take hours where it'd take tens of minutes over the earlier mass storage protocol. Not to mention that Windows Explorer is, for no clear reason, ''extremely'' impractical and ungainly in handling MTP folders.
* Some questions to Facebook support ask why it's possible to drag and drop entries in the TV Shows and Movies categories, but not other categories like Apps and Games or Music. There is no official answer. In previous versions of the Likes pages, drag and drop was possible in all categories, but this has remained disabled for more than a year. Enabling this should take the web programmers no more than ten minutes.
* Many websites are distressingly bad at password security, [[https://www.youtube.com/watch?v=8ZtInClXe1Q as Tom Scott explains]]. The worst offenders will simply email your password to you in cleartext on request, which implies that the website isn't hashing and salting user passwords -- so anyone who steals the site's database gets every password, from "12345" on up, ready to use. Scott argues that handling passwords at all is Idiot Programming waiting to happen, and that programmers should hand authentication off to a third-party service like Website/{{Facebook}} where possible, because security is so hard to get right.
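For the curious, "hashing and salting" is not exotic. A minimal sketch using only Python's standard library -- the iteration count is illustrative, and a real site should reach for a vetted scheme like bcrypt, scrypt, or Argon2 rather than rolling its own:

```python
import hashlib
import hmac
import os

def hash_password(password, salt=None):
    # A unique random salt per user defeats precomputed "rainbow table"
    # attacks: the same password hashes differently for every account.
    salt = salt if salt is not None else os.urandom(16)
    # A deliberately slow hash (PBKDF2 here) makes brute force expensive.
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def check_password(password, salt, stored_digest):
    _, candidate = hash_password(password, salt)
    return hmac.compare_digest(candidate, stored_digest)  # constant-time compare

# The site stores only (salt, digest) -- it never has a password to email you.
```

A site that can email you your password has, by definition, skipped all of the above.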
* UsefulNotes/{{Steam}}:
** A misconfiguration in Steam's caching on Christmas Day 2015 let logged-in users briefly see other people's accounts. Caching pages for logged-in users is another big security no-no.
** In February 2017, an exploit was discovered where the titles of user-made guides on one's profile page would be executed as [=JavaScript=] code, meaning anybody who visited a profile page could potentially be redirected to a fake phishing website, asked to download malware, or have marketplace transactions made with their account. Since Steam normally displays a warning page before linking users to external sites, users were more likely to fall for phishing and malware downloads. Protecting one's website from things like code injection is very important for security, so how a large company like Valve let something like this slip through is baffling.
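The guide-title bug is a textbook stored cross-site-scripting (XSS) hole: user-supplied text rendered as live markup instead of inert data. The standard defence is escaping on output, as in this minimal Python sketch (the title and URL are made up):

```python
import html

# User-controlled text -- a guide title here -- containing a script payload.
user_title = '<script>location="https://phish.example"</script>'

# Rendered raw, the payload executes in every visitor's browser:
unsafe_page = "<h1>" + user_title + "</h1>"

# Escaped, the same text is displayed literally and runs nothing:
safe_page = "<h1>" + html.escape(user_title) + "</h1>"

assert "<script>" in unsafe_page
assert "<script>" not in safe_page   # now &lt;script&gt;, inert text
```

Every mainstream templating engine does this automatically, which is why it slipping through on a site Valve's size was so baffling.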
* Website/GOGDotCom's "Galaxy" client has certain issues with download size (namely, getting it wrong) and download speed (roughly comparable to having the code yelled at you over the phone and programming it yourself). For example, take the Diamond edition of ''VideoGame/NeverwinterNights''. This comes to about 2.5GB, a maximum of 3 if you count all the music and avatars and so on that come as freebies. Galaxy will report the game size as being 5GB. It will then take an obscenely long time to download that 5GB; on a connection where Origin can get a game of that size downloaded within an eight-hour period, Galaxy will take twelve. Read that again: [[InsaneTrollLogic 12 hours to download all five gigabytes of a three-gigabyte game]]. Thankfully, Galaxy is completely optional and you can just download games directly from your account on the site, saving time, frustration and bandwidth.
* Many mobile apps don't allow the user to exit them under certain conditions, forcing mobile users to reboot or shut down their devices. The mobile version of the newsfeed aggregator Feedly, for example, won't let you exit the app until the Internet connection is back, nor will it let you reload the app to check whether there really is no connection (and there are cases where, even with a working Internet connection, it still outputs the dreaded "Are you connected to the Internet?" error).
* Users of the vastly popular Logitech C920 webcam are likely to have installed the Logitech Webcam Software application along with the drivers. Any such user should check their temp AppData folder for a file named "LWSDebugOut.txt": roughly every '''second''' that the software detects the webcam as disconnected, it feels the need to add a new entry to the error log. This can cause the file to swell to over 500 MB in only two months.
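Suppressing that kind of repeat is elementary. A minimal sketch of a deduplicating log filter in Python -- the logger name and time window are illustrative, not anything Logitech actually uses:

```python
import logging
import time

class RepeatSuppressor(logging.Filter):
    """Drops any message already logged within the last `window` seconds,
    so a condition polled once a second can't grow the log without bound."""

    def __init__(self, window=60.0):
        super().__init__()
        self.window = window
        self._last_seen = {}   # message text -> monotonic time it last passed

    def filter(self, record):
        now = time.monotonic()
        msg = record.getMessage()
        previous = self._last_seen.get(msg)
        if previous is not None and now - previous < self.window:
            return False               # drop the repeat within the window
        self._last_seen[msg] = now     # let it through and remember when
        return True

logger = logging.getLogger("webcam")
logger.addFilter(RepeatSuppressor(window=60.0))
```

With this in place, "webcam disconnected" appears once a minute instead of once a second -- a 60x reduction for about twenty lines of code.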
* For some reason, video player VLC likes to "rebuild its font cache" every time you open it up and try to play a video that has subtitles (God knows why it tries to do this, especially if the video you're trying to watch uses the same font for its subtitles as ones you watched in the past). It would just be a minor annoyance if it didn't take several minutes and have a tendency to ''cause the entire program to become unresponsive'' frequently. For a while, switching to a dummy font renderer or disabling it completely would fix this problem, but it was removed in an update around early 2016, meaning there is ''no way'' to avoid the frustration of having font rendering constantly crash your VLC every time you want to watch a video.
* [[https://www.f-secure.com/v-descs/pikachu.shtml Email-Worm.Win32.Pikachu]] is a {{Franchise/Pokemon}}-themed e-mail worm that circulated around the year 2000. It would've been very destructive, too, if it weren't for one major flaw. What the worm was intended to do was completely wipe the hard drive upon the next startup. What does it do instead? Every time you boot up your PC, it gives you a "yes or no" prompt flat-out asking you if you want the drive wiped.
** [=Virus.MSWord.Odious=] [[https://youtu.be/f1EUHNQa3EA had the same problem]].

[[folder:Programming Languages - It's All Greek to Me]]
It's times like this that make you wonder whether you ought to stick to programming a VCR rather than deal with source-code debugging.
* In a typical program in modern C++, 40-50% of the syntax will consist of calls on templates designed to shore up the language with modern features. [[AlternateCharacterInterpretation Although]] [[{{Fandom}} C++ proponents]] argue that [[SubvertedTrope the fact that it's possible to hack in such features at all]], [[ZigZaggingTrope albeit with difficulty]], is evidence of the language's expressiveness and power.[[labelnote:*]]And to be fair, C++ templates (a specific implementation of generic programming) are more versatile than C# or Java generics: C++ template parameters act as compile-time constants and can be pretty much anything. Conversely, Java generics actually disguise casting the object to and from '''Object''', while C# generics are limited to using code that works with '''Object''' or that the programmer can guarantee will be there beforehand (generally by making sure that the objects implement a specific interface), which doesn't include standard operators like '''+''' on generic types (unless you use an interface that defines ''operator +'', or whichever operator you want), and neither language allows you to specify anything other than a type in generics. But C++ templates are also more awkward, partially because when writing the templated code's implementation, you have to specify what each template parameter is -- for example, '''std::array''', a container for a C-style array designed to make it easier to write code that is interchangeable with other containers (such as vectors or lists), can be instantiated with something like ''std::array<int, 5>'' (to create an ''int[5]'' array), but the implementation is actually written for ''std::array<T, N>'' -- templates can be nested (such as ''std::array<std::vector<int>, 3>'', which creates a ''std::vector<int>[3]'' array), and their syntax is just kinda awkward in general. And nothing is guaranteed: unlike C# and Java, C++ has no base '''std::object''' class that all other classes implicitly derive from, so no code is guaranteed to work properly on any given object, unless the programmer explicitly makes sure it will.[[/labelnote]] What most people who have to use C++ will agree on, however, is that libraries that provide modern features in C++ like the [[http://www.boost.org/ Boost C++ Library]] (and even the [[AscendedFanon C++ standard library]], as of C++11 and C++14)[[labelnote:*]]Such as standardised threading and thread-local variable support, foreach (range-based for) loops, simple and efficient built-in safe boolean conversions that can't be used to inadvertently convert an object into a number, the aforementioned '''std::array''' container, and lambda functions, to name but five examples[[/labelnote]], count as [[SugarWiki/GeniusProgramming the other trope]].
* Earlier versions of Java would install updates not by patching the existing installation, but by creating a completely new installation in a separate folder ''without removing the old one''.
** This could be justifiable in a sense -- keeping the old versions serves as a crude form of backwards compatibility, ensuring that older code would be able to find the version of Java it was meant to use. If the install was small enough (not that Java is known for its brevity), it would be somewhat practical, though inelegant in the extreme and therefore a Bad Thing.
** Scarier still is the constant discovery of security vulnerabilities that allow malicious software to escape from the touted "sandbox mode" and harm Windows. Some viruses will search for outdated versions of Java left installed on your computer alongside the new ones, then exploit them to try to gain administrator privileges.
** While we're on Java... it was intended to be a platform-independent programming language, somewhere in-between a compiled and interpreted language. In practice, it worked okay on the SPARC processors used in Sun computers and on Intel-and-compatible processors (a market too large for Sun to ignore). On other architectures, it tended to be horrendously inefficient and dog-slow.
** Lately (Java 1.7 and higher) there's been a distressing tendency for each new Java "update" to be less efficient in memory usage, slower, and more draconian about its security policies, which ''the user cannot disable'' (there are various security options, but even the most permissive settings merely tone it down slightly). Instead of Security-Through-Obscurity, Java seems to be aiming for Security-Through-No-One-Can-Stand-To-Use-It-Anymore.
** A rather... interesting occurrence happens when certain programs written for 32-bit Java get run on a system with the 64-bit Java Runtime Environment installed. Namely, ''they don't work.'' This is because 32-bit Java has certain features which the 64-bit Java compiler compiles into streamlined settings, but if those features are specifically called rather than used "as intended" it can cause anything from lag to memory leaks to random crashes or even just not starting at all. The workaround is to install both the 32-bit and 64-bit [=JREs=] into different directories... but your system will only ever update one of them, so you now have to uninstall Java and reinstall it with every update you choose.
* The phenomenon of "ceremony" where large amounts of boilerplate code must be written even for relatively simple functions. In Java, for example, a class holding two values that can be read or written requires at least 4 lines of which 2 are ceremony, or if good OOP practice is used, 10 lines of which 8 are ceremony. Some languages, such as Factor and Common Lisp, attempt to eliminate ceremony entirely, but are often complex to use and potentially hard to understand as a result.
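The ratio described above can be seen in a minimal sketch (using a hypothetical ''Point'' class, not any example from a real codebase): two fields' worth of actual data balloon into a class that's mostly getters and setters once conventional OOP practice is applied.

```java
// A hypothetical two-value holder, written with conventional Java ceremony.
// The actual data is two fields; everything else is boilerplate.
public class Point {
    private int x;  // the data: line 1
    private int y;  // the data: line 2

    public int getX() { return x; }          // ceremony
    public void setX(int x) { this.x = x; }  // ceremony
    public int getY() { return y; }          // ceremony
    public void setY(int y) { this.y = y; }  // ceremony
}
```

Later Java versions added ''record'' types partly to collapse exactly this pattern back down to one line.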
* Nullable types, the bane of many programmers. In older pointer- or object-based languages, any time a programmer declares that a function or procedure inputs a complex value, it can be passed a "null" value which represents an absence of any input and will bring the program to a crashing (literally) halt if anything is done with it. This means that the programmer must manually test every input to every subprogram to check if it is null. Some modern languages such as Swift and Oxygene are starting to rectify this, but it still exists firmly in Java, C++, and C# -- the most commonly used languages (although some development systems are implementing design contracts allowing a program to guarantee that a value will not be null, and warning the programmer when an input value does not provide this guarantee).
** Nullable-by-default is so bad that C.A.R. Hoare, who introduced null references in ALGOL W in 1965 so as to allow the creation of cyclic data structures, later called them his "billion-dollar mistake."
** C++ introduced references, which cannot hold a value of null (and have a few other restrictions) and are recommended for anything that doesn't need the low-level features of pointers.
* Exceptions were originally a nifty way of allowing a function to report that something had gone wrong. Unfortunately, they have a major problem: if a function calls other functions that call other functions, it is possible that the top level caller has to handle so many potential exceptions that it cannot possibly deal with them all. This is a common cause of "an unknown error occurred" or "there was a problem" type error messages. Common Lisp attempted to work around this by offering ''restarts'' which allow the function to offer a method of fixing the problem as well as a notification that it happened, but these are complicated to use and often not that beneficial.
** Worse, there are two common bad practices for "handling" exceptions. One is to "swallow" any exception that the section of the program doesn't know how to handle [[note]]Proper methodology is to "throw" the exception again so that a process further up knows that there is a problem and can try to handle it[[/note]]. Another is to repackage the exception (generally in a common format used for the company in question) and to throw out the information in the original Exception. Both practices make it difficult for calling classes to properly handle errors due to lack of information.
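The two anti-patterns above, plus the proper alternative, can be sketched in Java (the method and message names are hypothetical). The key difference is whether the original exception survives for whoever has to debug the "unknown error" later.

```java
public class ExceptionDemo {
    // Anti-pattern 1: "swallowing" -- the error vanishes and the caller
    // carries on with a meaningless null as if nothing happened.
    static String readConfigSwallowed() {
        try {
            throw new java.io.IOException("disk on fire");
        } catch (java.io.IOException e) {
            return null;  // silently ignored
        }
    }

    // Anti-pattern 2: repackaging -- the caller learns *something* failed,
    // but the original message and stack trace are thrown away.
    static String readConfigRepackaged() {
        try {
            throw new java.io.IOException("disk on fire");
        } catch (java.io.IOException e) {
            throw new RuntimeException("an unknown error occurred");
        }
    }

    // Proper practice: rethrow with the original attached as the cause,
    // so callers further up can still see what actually went wrong.
    static String readConfigProper() {
        try {
            throw new java.io.IOException("disk on fire");
        } catch (java.io.IOException e) {
            throw new RuntimeException("config read failed", e);
        }
    }
}
```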
* The process of creating a GUI. On most operating systems, the programmer's method for creating a GUI is to build a big complicated data structure describing what's supposed to be in the window, then hand control over to the OS and wait for clicks to be reported. Very few languages have elegant ways of building structures of that type, leading to incredibly messy code. Some platforms tried to help by providing a way to create the data structures in question separately from the program… but eventually the program has to access them, and at least one such platform requires the programmer to write code to manually unpack the data structure again in order to get a reference.
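A minimal Swing sketch of the pattern described above (the form itself is hypothetical): the program builds a tree of widget objects describing the window, hands it to the toolkit, and waits for clicks to be reported back through callbacks. Even this trivial form is mostly plumbing.

```java
import javax.swing.*;
import java.awt.BorderLayout;

public class GuiDemo {
    // Build the big nested data structure describing the window's contents.
    static JPanel buildForm() {
        JPanel root = new JPanel(new BorderLayout());
        JPanel buttons = new JPanel();            // a nested container...
        JButton ok = new JButton("OK");           // ...holding a widget...
        ok.addActionListener(e ->                 // ...with a callback attached,
            System.out.println("clicked"));       // invoked later by the toolkit
        buttons.add(ok);
        root.add(new JLabel("Name:"), BorderLayout.NORTH);
        root.add(new JTextField(20), BorderLayout.CENTER);
        root.add(buttons, BorderLayout.SOUTH);
        return root;
    }
}
```

In a real program this structure would then be placed in a ''JFrame'' and control handed over to the Swing event loop, which is where the "wait for clicks to be reported" half of the bargain kicks in.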
* Software bloat is a major issue today: programs get loaded up with more and more features, with little regard for how hard it becomes to prevent security exploits in such a thick "book" of code.\\
As an example, the [[https://en.wikipedia.org/wiki/Heartbleed Heartbleed exploit]] was an encryption flaw in a program called [=OpenSSL=] allowing an attacker to send requests to read data that should not be read, allowing the theft of confidential information from websites. Theo de Raadt, founder and leader of the [=OpenSSH=] and [=OpenBSD=] projects (the latter of which strongly prioritizes security in a default OS installation, as noted on the project's website), criticized the team for writing their own memory-management routines and bypassing the exploit counter-measures put in place by [=OpenBSD=]'s C library. The [=OpenSSL=] project being severely underfunded with only two people to write, maintain, and test 500,000 lines of business-critical code was not helpful either. The [=LibreSSL=] project, which is a fork of [=OpenSSL=], was founded by the [=OpenBSD=] project partly to trim down the code bloat, and to make the software suite easier to harden against exploits. Google also ended up making its own fork of [=OpenSSL=], called [=BoringSSL=], which works with the [=LibreSSL=] project.
* [[https://en.wikipedia.org/wiki/Esoteric_programming_language Esoteric Programming Languages]] turn this into an art form, with a helping of StylisticSuck. How about a programming language that uses nothing but whitespace characters? Or a language whose code is written in bitmaps that look like abstract art?

[[folder:Threat Prevention Prevention Software]]
It's never a good thing when your antivirus software is just as annoying as (if not more annoying than) an actual virus infection.
* Thousands of computers using [=McAfee=] Antivirus were brought down on April 21, 2010 by an update that ''misidentified an essential Windows file as a virus'', causing computers to constantly reboot. Barring exploits your average user wouldn't be familiar with in the slightest, users would have to go to an unaffected computer to get the fix and install it manually, as they couldn't go online on the infected PC.
** [=McAfee's=] strength is that it blocks everything that might be a threat...which is also its main weakness. If you wish to use a program that it considers a threat (and as of this writing, it considers ''VideoGame/DhuxsScar'' and ''VideoGame/SonicMania'', among other things, to be such programs), you cannot get it to grant an exception. You're supposed to send [=McAfee's=] developers an email telling them it's a false alarm. If they don't respond, you need to disable [=McAfee=] every time you want to use the program.
** On many computers, [=McAfee=] will make the CD drive stop working. And for a while, [=McAfee=] would often be stealthily installed by default during installations of something completely unrelated (and still is--looking at you, Adobe Flash Player).
** [=McAfee=] is also easily thwarted by memory-resident viruses. Back in the days of MS-DOS, the DOS version of [=ViruScan=] failed to detect the [=AntiCMOS=] virus[[note]]One of the few particularly nasty viruses that attempts to delete a PC's BIOS flash memory as its payload. This is one of two reasons some motherboards of that era have several ways of reflashing the system despite having a hosed BIOS – the other being the user hosing the BIOS during an upgrade[[/note]] if the system was booted from an ''already'' infected disk in the first place, even with an up-to-date signature file. By comparison, other antiviruses would detect the virus in memory, freeze the PC, and instruct the user to reboot and run the scanner from a "rescue" floppy before proceeding.
* On April 15th, 2013, Malwarebytes had a catastrophic false positive error, which caused it to mark every DLL, media file, and EXE as a trojan downloader and quarantine it. This ate up many users' hard drives, and rendered hundreds of machines inoperable, causing a large number of files to be lost forever.
* Norton's notorious for this sort of thing:
** Norton Internet Security blocks any and all images with certain dimensions, specifically those that are commonly used for advertisements. Problem is, at least one of the sizes is also commonly used by sites for non-ad purposes. In older versions, this ''could not be turned off'' without disabling all filtering completely.
** Some older versions of Norton products, particularly Norton [=SystemWorks=] and Norton Internet Security, cannot be uninstalled without risking damage to the computer -- [=PCs=] would wind up crashed, bricked or with corrupted files on the hard drive (including ''Windows Registry''). Symantec had to create a special program for the sole purpose of safely and cleanly uninstalling Norton products, dubbed the Norton Removal Tool.
** It is not unheard of for Norton Antivirus to ''declare itself a virus, and attempt to delete itself''.
** Norton 360 in particular...
*** ...will block and delete any less-than-common executable run by the user. This includes code the user has written themselves.
*** ...deletes DLL files at random. The screen does not let you override this. The "Learn More" button directs to a Japanese version of the Norton product page.
*** ...disables (as of October 2012) ALL network access, including offline, even when the firewall is listed as disabled in the options. The only way to address this is by removing Norton, and this must be done with the Norton Removal Tool in order to reverse the damage done to networking components.
*** ...its firewall disables Firefox. For no reason at all.
*** ...increases the time needed to boot Windows by more than 500%.
** Norton has also fallen prey to a host of other problems, such as a rather frivolous firewall and bad updates that at best gave [=BSODs=] when simply inserting a USB key and at worst forced users to perform a system restore.
** Norton Antivirus's uninstaller also often accidentally deletes DLL files that are used by other software and drivers. One particular uninstall instance caused a system to BSOD and lose the ability to play sound upon reboot because said DLL was also being used by the Creative [=SoundBlaster=] Live! Drivers. One had to reinstall said drivers to fix the issue and restore the ability to play sound to the computer.
* Symantec Antivirus tends to interfere with right-click menus and, more glaringly, ''cause [=BSODs=] within 5 minutes of starting up your computer'' more often than it actually blocks threats to the operating system. One could say that ''Symantec'' is a threat to your operating system.
* Sophos Antivirus, some time in late 2012, released a virus database update which blocked files with certain extensions, but only when they were auto-downloaded by programs without user prompts. This was supposedly in response to a series of malware programs that would do just that in an attempt to remotely wrest control of Windows computers from their owners. However, one of the files that met the above criteria was the antivirus' ''auto-updater''. Any system with that update would grind to a halt during startup. The patch (which wasn't released for ''nearly a month'') had to be downloaded from a disk or drive while the computer was in safe mode, and in office networks, it could only be done by somebody with top-tier administration privileges. This was because the software, once disabled, would automatically reactivate (which isn't a bad feature on its own, but here, it exacerbated existing problems).
* Sadly, AVG is getting in on the same problems as Norton.
** As discussed above, the AVG Toolbar slows down your browser by absurd amounts. Its core process vprot.exe also eats up a tremendous amount of CPU time (on laptops it will often hog a whole core) even when the browser is idle or occasionally ''not open at all''.
** Not bad enough? In January 2012, the toolbar was bugged to continuously add onto its log file. This would quickly reach upwards of ''40 gigabytes in length'' of nothing but acknowledging calls to arcane functions. Any user who's been using AVG around that era is advised to check WINDOWS/Temp/toolbar_log.txt to see if the bloated file is still there, because the toolbar prevents most disk cleanup tools from deleting it.
** Or how about the 2010 update that rendered systems unbootable (and thus inoperable) by [[http://www.theinquirer.net/inquirer/news/1929949/avg-update-bricks-bit-windows-systems mistaking a critical file in 64-bit versions of Windows 7 for malicious coding?]]
** Like Norton, it's also starting to randomly identify almost any file it's never seen before as a Trojan Horse. This mostly includes executables you developed yourself, arbitrary setup.exe's in your Downloads folder, and occasionally key files of development tools such as the Irvine Assembly libraries.
** In the early days of 64-bit consumer Windows, AVG would detect Windows XP Professional x64 Edition as Windows Server 2003 R2 (the former is based on the latter) and, instead of installing, would complain about the software not being licensed to run on Server [=OSes=] and push the user to buy AVG's server antivirus products instead.
* Avast!:
** Avast! actually made a blunder back in 2012 that caused the antivirus to freeze Windows XP Professional x64 Edition machines, and ''only'' XP Professional x64 Edition machines. The 64-bit versions of Vista and 7 were unaffected. The biggest kick in the nuts is that restoring the machine to a usable state requires rebooting into safe mode, which renders the uninstaller unusable because safe mode also disables the Windows Installer services[[labelnote:*]]It is possible to enable Windows Installer services in safe mode, but this requires a registry hack that most users are unaware of.[[/labelnote]]. One has to somehow get the Avast remover program from Avast's website on another PC, move it to the affected machine, and run it in safe mode to remove Avast and return the PC to a usable state.
** Like AVG and Norton, Avast is now starting to block ''and quarantine'' programs that you've compiled yourself.
** Avast! may get defensive when you try to install the Razer Synapse mouse software for your Razer mice. Simply plugging in a Death Adder will cause Windows 10 to start retrieving the needed drivers, possibly triggering a threat detection.
** [[https://forum.avast.com/index.php?topic=197572.135 Avast! virus definitions 170221-1 22.2.2017]] caused many files to be detected as VBS:Malware:Gen. If a user panicked and deleted all of those files, critical system files included, the system would likely be left inoperable. Randomly deleting your files: isn't that malware's job?
* Panda Antivirus has [[http://www.theregister.co.uk/2015/03/11/panda_antivirus_update_self_pwn/ joined the ranks]] of security software that flags itself as a threat, tries to intervene, and borks the system as a result.
* Windows Defender, prior to the Windows 10 Creators Update, had an... interesting... way of determining if a [=JavaScript=] program was dangerous: run it. Run it ''inside Windows Defender'', which runs as a System Service and has permission to change anything on the system. Did we mention it does this whenever a file containing such a script is ''accessed'', even if the user does nothing that would run the script?