History DarthWiki / IdiotDesign


Reason: None


** There's a character supposedly called "face with a look of triumph" which, on Windows and Android at least (plus [[https://www.fileformat.info/info/unicode/char/1f624/index.htm this character database]]), actually represent an angry face steaming from the nose (😤).

to:

** There's a character supposedly called "face with a look of triumph" which, on Windows and Android at least (plus [[https://www.fileformat.info/info/unicode/char/1f624/index.htm this character database]]), actually represents an angry face steaming from the nose (😤).
Reason: None


*** Things were turned up to eleven by the Atari Jaguar CD add-on. Aside from the crappy overall production quality of the add-on (the Jaguar itself wasn't too hot in this department, either) and poor aesthetics which [[WebVideo/TheAngryVideoGameNerd many]] [[WebVideo/TheSpoonyExperiment people]] have likened to a toilet seat, the CD sat on top of the Jaguar and often failed to connect properly to the cartridge slot, as opposed to the similar add-ons for the NES and Genesis which used the console's own weight to secure a good connection. Moreover, the disc lid was badly designed and tended to squash the CD against the bottom of the console, which in turn would cause the already low-quality disc motor to break apart internally from its fruitless attempts to spin the disc. Even inserting the Memory Track cartridge wrong can cause the main Jaguar unit to return a connection error, due to cartridges in the main console taking boot priority over [=CDs=] in the addon, so if the Memory Track cartridge isn't inserted properly it's simply read as a regular game cartridge that isn't inserted properly and cause a failure to boot. All of this was compounded by Atari's decision to ditch any form of error protection code so as to increase the disc capacity to 800 megabytes, which caused software errors aplenty, and the fact that the parts themselves tended to be defective. And remember, this was an add-on for a console that only sold 125,000 units, making its very existence an example of idiot design. Only 25,000 were produced and nowhere near that amount were sold or even shipped (and only 13 games were ever released). Due to scarcity, Jaguar [=CDs=] go for ungodly amounts of money on auction sites, and due to the generally poor design, you're unlikely to get a working unit.

to:

*** Things were turned up to eleven by the Atari Jaguar CD add-on. Aside from the crappy overall production quality of the add-on (the Jaguar itself wasn't too hot in this department, either) and poor aesthetics which [[WebVideo/TheAngryVideoGameNerd many]] [[WebVideo/TheSpoonyExperiment people]] have likened to a toilet seat, the CD sat on top of the Jaguar and often failed to connect properly to the cartridge slot, as opposed to the similar add-ons for the NES and Genesis which used the console's own weight to secure a good connection. Moreover, the disc lid was badly designed and tended to squash the CD against the bottom of the console, which in turn would cause the already low-quality disc motor to break apart internally from its fruitless attempts to spin the disc. Even inserting the Memory Track cartridge wrong can cause the main Jaguar unit to return a connection error, due to cartridges in the main console taking boot priority over [=CDs=] in the addon, so if the Memory Track cartridge isn't inserted properly it's simply read as a regular game cartridge that isn't inserted properly and cause a failure to boot. All of this was compounded by Atari's decision to ditch any form of error protection code so as to increase the disc capacity to 800 megabytes, which caused software errors aplenty, and the fact that the parts themselves tended to be defective. And remember, this was an add-on for a console that only sold 125,000 units, making its very existence an example of idiot design. Only 25,000 were produced and nowhere near that amount were sold or even shipped (and only 13 games were ever released). Due to scarcity, Jaguar [=CDs=] go for ungodly amounts of money on auction sites, and due to the generally poor design, you're more likely to end up TrappedInAnotherWorld than you are to get a working unit.
Reason: None


*** Things were turned up to eleven by the Atari Jaguar CD add-on. Aside from the crappy overall production quality of the add-on (the Jaguar itself wasn't too hot in this department, either) and poor aesthetics which [[WebVideo/TheAngryVideoGameNerd many]] [[WebVideo/TheSpoonyExperiment people]] have likened to a toilet seat, the CD sat on top of the Jaguar and often failed to connect properly to the cartridge slot, as opposed to the similar add-ons for the NES and Genesis which used the console's own weight to secure a good connection. Moreover, the disc lid was badly designed and tended to squash the CD against the bottom of the console, which in turn would cause the already low-quality disc motor to break apart internally from its fruitless attempts to spin the disc. Even inserting the Memory Track cartridge wrong can cause the main Jaguar unit to return a connection error, due to cartridges in the main console taking boot priority over [=CDs=] in the addon, so if the Memory Track cartridge isn't inserted properly it's simply read as a regular game cartridge that isn't inserted properly and cause a failure to boot. All of this was compounded by Atari's decision to ditch any form of error protection code so as to increase the disc capacity to 800 megabytes, which caused software errors aplenty, and the fact that the parts themselves tended to be defective. And remember, this was an add-on for a console that only sold 125,000 units, making its very existence an example of idiot design. Only 25,000 were produced and nowhere near that amount were sold or even shipped (and only 13 games were ever released). Due to scarcity working units, Jaguar [=CDs=] go for ungodly amounts of money on auction sites.

to:

*** Things were turned up to eleven by the Atari Jaguar CD add-on. Aside from the crappy overall production quality of the add-on (the Jaguar itself wasn't too hot in this department, either) and poor aesthetics which [[WebVideo/TheAngryVideoGameNerd many]] [[WebVideo/TheSpoonyExperiment people]] have likened to a toilet seat, the CD sat on top of the Jaguar and often failed to connect properly to the cartridge slot, as opposed to the similar add-ons for the NES and Genesis which used the console's own weight to secure a good connection. Moreover, the disc lid was badly designed and tended to squash the CD against the bottom of the console, which in turn would cause the already low-quality disc motor to break apart internally from its fruitless attempts to spin the disc. Even inserting the Memory Track cartridge wrong can cause the main Jaguar unit to return a connection error, due to cartridges in the main console taking boot priority over [=CDs=] in the addon, so if the Memory Track cartridge isn't inserted properly it's simply read as a regular game cartridge that isn't inserted properly and cause a failure to boot. All of this was compounded by Atari's decision to ditch any form of error protection code so as to increase the disc capacity to 800 megabytes, which caused software errors aplenty, and the fact that the parts themselves tended to be defective. And remember, this was an add-on for a console that only sold 125,000 units, making its very existence an example of idiot design. Only 25,000 were produced and nowhere near that amount were sold or even shipped (and only 13 games were ever released). Due to scarcity, Jaguar [=CDs=] go for ungodly amounts of money on auction sites, and due to the generally poor design, you're unlikely to get a working unit.
Reason: None


*** Things were turned up to eleven by the Atari Jaguar CD add-on. Aside from the crappy overall production quality of the add-on (the Jaguar itself wasn't too hot in this department, either) and poor aesthetics which [[WebVideo/TheAngryVideoGameNerd many]] [[WebVideo/TheSpoonyExperiment people]] have likened to a toilet seat, the CD sat on top of the Jaguar and often failed to connect properly to the cartridge slot, as opposed to the similar add-ons for the NES and Genesis which used the console's own weight to secure a good connection. Moreover, the disc lid was badly designed and tended to squash the CD against the bottom of the console, which in turn would cause the already low-quality disc motor to break apart internally from its fruitless attempts to spin the disc. Even inserting the Memory Track cartridge wrong can cause the main Jaguar unit to return a connection error, due to cartridges in the main console taking boot priority over [=CDs=] in the addon, so if the Memory Track cartridge isn't inserted properly it's simply read as a regular game cartridge that isn't inserted properly and cause a failure to boot. All of this was compounded by Atari's decision to ditch any form of error protection code so as to increase the disc capacity to 800 megabytes, which caused software errors aplenty, and the fact that the parts themselves tended to be defective. And remember, this was an add-on for a console that only sold 125,000 units, making its very existence an example of idiot design. Only 25,000 were produced and nowhere near that amount were sold or even shipped (and only 13 games were ever released), due to scarcity working Jaguar [=CDs=] go for ungodly amounts of money on auction sites.

to:

*** Things were turned up to eleven by the Atari Jaguar CD add-on. Aside from the crappy overall production quality of the add-on (the Jaguar itself wasn't too hot in this department, either) and poor aesthetics which [[WebVideo/TheAngryVideoGameNerd many]] [[WebVideo/TheSpoonyExperiment people]] have likened to a toilet seat, the CD sat on top of the Jaguar and often failed to connect properly to the cartridge slot, as opposed to the similar add-ons for the NES and Genesis which used the console's own weight to secure a good connection. Moreover, the disc lid was badly designed and tended to squash the CD against the bottom of the console, which in turn would cause the already low-quality disc motor to break apart internally from its fruitless attempts to spin the disc. Even inserting the Memory Track cartridge wrong can cause the main Jaguar unit to return a connection error, due to cartridges in the main console taking boot priority over [=CDs=] in the addon, so if the Memory Track cartridge isn't inserted properly it's simply read as a regular game cartridge that isn't inserted properly and cause a failure to boot. All of this was compounded by Atari's decision to ditch any form of error protection code so as to increase the disc capacity to 800 megabytes, which caused software errors aplenty, and the fact that the parts themselves tended to be defective. And remember, this was an add-on for a console that only sold 125,000 units, making its very existence an example of idiot design. Only 25,000 were produced and nowhere near that amount were sold or even shipped (and only 13 games were ever released). Due to scarcity working units, Jaguar [=CDs=] go for ungodly amounts of money on auction sites.
Reason: None


*** Things were turned up to eleven by the Atari Jaguar CD add-on. Aside from the crappy overall production quality of the add-on (the Jaguar itself wasn't too hot in this department, either) and poor aesthetics which [[WebVideo/TheAngryVideoGameNerd many]] [[WebVideo/TheSpoonyExperiment people]] have likened to a toilet seat, the CD sat on top of the Jaguar and often failed to connect properly to the cartridge slot, as opposed to the similar add-ons for the NES and Genesis which used the console's own weight to secure a good connection. Moreover, the disc lid was badly designed and tended to squash the CD against the bottom of the console, which in turn would cause the already low-quality disc motor to break apart internally from its fruitless attempts to spin the disc. Even inserting the Memory Track cartridge wrong can cause the main Jaguar unit to return a connection error, due to cartridges in the main console taking boot priority over [=CDs=] in the addon, so if the Memory Track cartridge isn't inserted properly it's simply read as a regular game cartridge that isn't inserted properly and cause a failure to boot. All of this was compounded by Atari's decision to ditch any form of error protection code so as to increase the disc capacity to 800 megabytes, which caused software errors aplenty, and the fact that the parts themselves tended to be defective.

to:

*** Things were turned up to eleven by the Atari Jaguar CD add-on. Aside from the crappy overall production quality of the add-on (the Jaguar itself wasn't too hot in this department, either) and poor aesthetics which [[WebVideo/TheAngryVideoGameNerd many]] [[WebVideo/TheSpoonyExperiment people]] have likened to a toilet seat, the CD sat on top of the Jaguar and often failed to connect properly to the cartridge slot, as opposed to the similar add-ons for the NES and Genesis which used the console's own weight to secure a good connection. Moreover, the disc lid was badly designed and tended to squash the CD against the bottom of the console, which in turn would cause the already low-quality disc motor to break apart internally from its fruitless attempts to spin the disc. Even inserting the Memory Track cartridge wrong can cause the main Jaguar unit to return a connection error, due to cartridges in the main console taking boot priority over [=CDs=] in the addon, so if the Memory Track cartridge isn't inserted properly it's simply read as a regular game cartridge that isn't inserted properly and cause a failure to boot. All of this was compounded by Atari's decision to ditch any form of error protection code so as to increase the disc capacity to 800 megabytes, which caused software errors aplenty, and the fact that the parts themselves tended to be defective. And remember, this was an add-on for a console that only sold 125,000 units, making its very existence an example of idiot design. Only 25,000 were produced and nowhere near that amount were sold or even shipped (and only 13 games were ever released), due to scarcity working Jaguar [=CDs=] go for ungodly amounts of money on auction sites.
Reason: None


* The UsefulNotes/SuperNintendoEntertainmentSystem was an overall improvement over the NES, doing away with the front-loading cartridge slot altogether and using better-quality pin connectors. That being said, there's still one notable flaw in the system in that the power plug on the back of the system is prone to breaking. Whereas the inner barrel of the power plug on the NES is made of metal, the SNES, despite being compatible with the NES's power cable, uses a plastic barrel instead. This results in a more fragile power port in which the inner barrel breaks off from the stress of repeatedly plugging and unplugging the power cord from the system end.

to:

* The UsefulNotes/SuperNintendoEntertainmentSystem was an overall improvement over the NES, doing away with the front-loading cartridge slot altogether and using better-quality pin connectors. That being said, there's still one notable flaw in the system in that the power plug on the back of the system is prone to breaking. Whereas the inner barrel of the power plug on the NES is made of metal, the SNES, on the other hand, uses a plastic barrel instead. This results in a more fragile power port in which the inner barrel breaks off from the stress of repeatedly plugging and unplugging the power cord from the system end.
Reason: None

Added DiffLines:

* The UsefulNotes/SuperNintendoEntertainmentSystem was an overall improvement over the NES, doing away with the front-loading cartridge slot altogether and using better-quality pin connectors. That being said, there's still one notable flaw in the system in that the power plug on the back of the system is prone to breaking. Whereas the inner barrel of the power plug on the NES is made of metal, the SNES, despite being compatible with the NES's power cable, uses a plastic barrel instead. This results in a more fragile power port in which the inner barrel breaks off from the stress of repeatedly plugging and unplugging the power cord from the system end.
Reason: None


* The [[https://youtu.be/zvLEoxBXcmU Standard Manufacturing S333 Thunderstruck]], a double-barreled .22 Magnum revolver. While Karl Kasarda of InRange was initially intrigued by the concept, during testing he found that the gun had such horrible accuracy that it could barely hit reliably at 10 yards. After much testing and anecdotes from other owners, it was determined that Standard Manufacturing had been boring the rifling of both barrels simultaneously and one of them was badly misaligned, causing one barrel to be fairly accurate but one to send bullets spiraling sideways.

to:

* The [[https://youtu.be/zvLEoxBXcmU Standard Manufacturing S333 Thunderstruck]], a double-barreled .22 Magnum revolver. While Karl Kasarda of [=InRange=] was initially intrigued by the concept, during testing he found that the gun had such horrible accuracy that it could barely hit reliably at 10 yards. After much testing and anecdotes from other owners, it was determined that Standard Manufacturing had been boring the rifling of both barrels simultaneously and one of them was badly misaligned, causing one barrel to be fairly accurate but one to send bullets spiraling sideways.
Reason: None


** The usage of an optical disc format on the PSP can qualify. On paper, it made perfect sense to choose optical discs over cartridges because of the former's storage size advantages and relative low manufacturing expense, and it enabled the release of high-quality video on the PSP. After all, utilizing optical discs instead of cartridges had propelled the original UsefulNotes/PlayStation to much greater success than [[UsefulNotes/Nintendo64 its competition from Nintendo]], so there was no reason to assume the same thing wouldn't happen again in the handheld market. However, in practice, optical discs quickly proved to be a poor fit for a handheld system: Sony's Universal Media Disc (UMD) was fragile, as there have been many cases of the outer plastic shell which protected the actual data disc cracking and breaking, rendering the disc useless until the user buys an aftermarket replacement shell. In addition, it wasn't uncommon for the PSP's UMD drive to fail due to wear of the moving parts. UMD also failed to catch on as a video format due to their proprietary technology making them more expensive to produce than other optical disc formats, and thus were priced higher than [=DVDs=] despite holding noticeably less data.[[note]]UMD holds at most 1.8 gigabytes on a dual-layer disc; DVD holds at ''least'' 4.7 on a single-layer disc, more than two and a half times that.[[/note]] There were also no devices besides the PSP that could play UMD movies, meaning that you were stuck watching your UMD movies on the PSP's small screen, and couldn't swap them with your friends unless they too had a PSP. This drove away consumers, who would rather purchase a portable DVD player and have access to a cheaper media library, while the more tech-savvy can rip their [=DVDs=] and put them on Memory Sticks to watch on PSP without a separate disc. In addition, [[LoadsAndLoadsOfLoading UMD load times were]] ''[[LoadsAndLoadsOfLoading long]]'', compared to that of a standard UsefulNotes/NintendoDS cartridge, which Sony themselves tried to fix by doubling the memory of the PSP in later models to use as a UMD cache. By 2009, Sony themselves were trying to phase out the UMD with the PSP Go, which did not have a UMD drive and relied on digital downloads from the [=PlayStation=] Store, but it was too late; most games were already released on UMD while very few were actually made available digitally, and so the Go blocked off a major portion of PSP games, which led to consumers ignoring the Go. Sony's decision to use standard cartridges for the PSP's successor, the UsefulNotes/PlayStationVita, seemed like a tacit admission that using UMD on the PSP was a mistake.

to:

** The usage of an optical disc format on the PSP can qualify. On paper, it made perfect sense to choose optical discs over cartridges because of the former's storage size advantages and relative low manufacturing expense, and it enabled the release of high-quality video on the PSP. After all, utilizing optical discs instead of cartridges had propelled the original UsefulNotes/PlayStation to much greater success than [[UsefulNotes/Nintendo64 its competition from Nintendo]], so there was no reason to assume the same thing wouldn't happen again in the handheld market. However, in practice, optical discs quickly proved to be a poor fit for a handheld system: Sony's Universal Media Disc (UMD) was fragile, as there have been many cases of the outer plastic shell which protected the actual data disc cracking and breaking, rendering the disc useless until the user buys an aftermarket replacement shell. In addition, it wasn't uncommon for the PSP's UMD drive to fail due to wear of the moving parts. UMD also failed to catch on as a video format due to their proprietary technology making them more expensive to produce than other optical disc formats, and thus were priced higher than [=DVDs=] despite holding noticeably less data.[[note]]UMD holds at most 1.8 gigabytes on a dual-layer disc; DVD holds at ''least'' 4.7 on a single-layer disc, more than two and a half times that.[[/note]] There were also no devices besides the PSP that could play UMD movies, meaning that you were stuck watching your UMD movies on the PSP's small screen, and couldn't swap them with your friends unless they too had a PSP. This drove away consumers, who would rather purchase a portable DVD player and have access to a cheaper media library, while the more tech-savvy can rip their [=DVDs=] and put them on Memory Sticks to watch on PSP without a separate disc. In addition, [[LoadsAndLoadsOfLoading UMD load times were]] ''[[LoadsAndLoadsOfLoading long]]'', compared to that of a standard UsefulNotes/NintendoDS cartridge, which Sony themselves tried to fix by doubling the memory of the PSP in later models to use as a UMD cache. By 2009, Sony themselves were trying to phase out the UMD with the PSP Go, which did not have a UMD drive and relied on digital downloads from the [=PlayStation=] Store, but it was too late; most games were already released on UMD while very few were actually made available digitally, and so the Go blocked off a major portion of PSP games without offering any real advantage to make up for it because the [=PlayStation=] Store was available on ''all'' PSP models at the time, which led to consumers ignoring the Go. Sony's decision to use standard cartridges for the PSP's successor, the UsefulNotes/PlayStationVita, seemed like a tacit admission that using UMD on the PSP was a mistake.
Reason: None


** The usage of an optical disc format on the PSP can qualify. On paper, it made perfect sense to choose optical discs over cartridges because of the former's storage size advantages and relative low manufacturing expense, and it enabled the release of high-quality video on the PSP. After all, utilizing optical discs instead of cartridges had propelled the original UsefulNotes/PlayStation to much greater success than [[UsefulNotes/Nintendo64 its competition from Nintendo]], so there was no reason to assume the same thing wouldn't happen again in the handheld market. However, in practice, optical discs quickly proved to be a poor fit for a handheld system: Sony's Universal Media Disc (UMD) was fragile, as there have been many cases of the outer plastic shell which protected the actual data disc cracking and breaking, rendering the disc useless until the user buys an aftermarket replacement shell. In addition, it wasn't uncommon for the PSP's UMD drive to fail due to wear of the moving parts. UMD also failed to catch on as a video format due to their proprietary technology making them more expensive to produce than other optical disc formats, and thus were priced higher than [=DVDs=] despite holding a fifth as much data. There were also no devices besides the PSP that could play UMD movies, meaning that you were stuck watching your UMD movies on the PSP's small screen, and couldn't swap them with your friends unless they too had a PSP. This drove away consumers, who would rather purchase a portable DVD player and have access to a cheaper media library, while the more tech-savvy can rip their [=DVDs=] and put them on Memory Sticks to watch on PSP without a separate disc. In addition, [[LoadsAndLoadsOfLoading UMD load times were]] ''[[LoadsAndLoadsOfLoading long]]'', compared to that of a standard UsefulNotes/NintendoDS cartridge, which Sony themselves tried to fix by doubling the memory of the PSP in later models to use as a UMD cache. By 2009, Sony themselves were trying to phase out the UMD with the PSP Go, which did not have a UMD drive and relied on digital downloads from the [=PlayStation=] Store, but it was too late; most games were already released on UMD while very few were actually made available digitally, and so the Go blocked off a major portion of PSP games, which led to consumers ignoring the Go. Sony's decision to use standard cartridges for the PSP's successor, the UsefulNotes/PlayStationVita, seemed like a tacit admission that using UMD on the PSP was a mistake.

to:

** The usage of an optical disc format on the PSP can qualify. On paper, it made perfect sense to choose optical discs over cartridges because of the former's storage size advantages and relative low manufacturing expense, and it enabled the release of high-quality video on the PSP. After all, utilizing optical discs instead of cartridges had propelled the original UsefulNotes/PlayStation to much greater success than [[UsefulNotes/Nintendo64 its competition from Nintendo]], so there was no reason to assume the same thing wouldn't happen again in the handheld market. However, in practice, optical discs quickly proved to be a poor fit for a handheld system: Sony's Universal Media Disc (UMD) was fragile, as there have been many cases of the outer plastic shell which protected the actual data disc cracking and breaking, rendering the disc useless until the user buys an aftermarket replacement shell. In addition, it wasn't uncommon for the PSP's UMD drive to fail due to wear of the moving parts. UMD also failed to catch on as a video format due to their proprietary technology making them more expensive to produce than other optical disc formats, and thus were priced higher than [=DVDs=] despite holding noticeably less data.[[note]]UMD holds at most 1.8 gigabytes on a dual-layer disc; DVD holds at ''least'' 4.7 on a single-layer disc, more than two and a half times that.[[/note]] There were also no devices besides the PSP that could play UMD movies, meaning that you were stuck watching your UMD movies on the PSP's small screen, and couldn't swap them with your friends unless they too had a PSP. This drove away consumers, who would rather purchase a portable DVD player and have access to a cheaper media library, while the more tech-savvy can rip their [=DVDs=] and put them on Memory Sticks to watch on PSP without a separate disc. In addition, [[LoadsAndLoadsOfLoading UMD load times were]] ''[[LoadsAndLoadsOfLoading long]]'', compared to that of a standard UsefulNotes/NintendoDS cartridge, which Sony themselves tried to fix by doubling the memory of the PSP in later models to use as a UMD cache. By 2009, Sony themselves were trying to phase out the UMD with the PSP Go, which did not have a UMD drive and relied on digital downloads from the [=PlayStation=] Store, but it was too late; most games were already released on UMD while very few were actually made available digitally, and so the Go blocked off a major portion of PSP games, which led to consumers ignoring the Go. Sony's decision to use standard cartridges for the PSP's successor, the UsefulNotes/PlayStationVita, seemed like a tacit admission that using UMD on the PSP was a mistake.
Reason: None


** The usage of an optical disc format on the PSP can qualify. On paper, it made perfect sense to choose optical discs over cartridges because of the former's storage size advantages and relative low manufacturing expense, and it enabled the release of high-quality video on the PSP. After all, utilizing optical discs instead of cartridges had propelled the original UsefulNotes/PlayStation to much greater success than [[UsefulNotes/Nintendo64 its competition from Nintendo]], so there was no reason to assume the same thing wouldn't happen again in the handheld market. However, in practice, optical discs quickly proved to be a poor fit for a handheld system: Sony's Universal Media Disc (UMD) was fragile, as there have been many cases of the outer plastic shell which protected the actual data disc cracking and breaking, rendering the disc useless until the user buys an aftermarket replacement shell. In addition, it wasn't uncommon for the PSP's UMD drive to fail due to wear of the moving parts. UMD also failed to catch on as a video format due to their proprietary technology making them more expensive to produce than other optical disc formats, and thus were priced higher than [=DVDs=]. There were also no devices besides the PSP that could play UMD movies, meaning that you were stuck watching your UMD movies on the PSP's small screen, and couldn't swap them with your friends unless they too had a PSP. This drove away consumers, who would rather purchase a portable DVD player and have access to a cheaper media library, while the more tech-savvy can rip their [=DVDs=] and put them on Memory Sticks to watch on PSP without a separate disc. In addition, [[LoadsAndLoadsOfLoading UMD load times were]] ''[[LoadsAndLoadsOfLoading long]]'', compared to that of a standard UsefulNotes/NintendoDS cartridge, which Sony themselves tried to fix by doubling the memory of the PSP in later models to use as a UMD cache. By 2009, Sony themselves were trying to phase out the UMD with the PSP Go, which did not have a UMD drive and relied on digital downloads from the [=PlayStation=] Store, but it was too late; most games were already released on UMD while very few were actually made available digitally, and so the Go blocked off a major portion of PSP games, which led to consumers ignoring the Go. Sony's desigion to use standard cartridges for the PSP's successor, the UsefulNotes/PlayStationVita, seemed like a tacit admission that using UMD on the PSP was a mistake.
** Like the Xbox 360, the [=PlayStation=] 3 suffered from some growing pains. It also used the same lead-free solder which was prone to breakage, but while Sony designed the [=PS3=] for quiet operation and overall had a better cooling system than the Xbox 360, there was one major problem: that Sony had used low-quality thermal paste for the Cell and RSX processors that were prone to drying up, and they dried up quickly. The result? [=PS3s=] would begin running loud mere ''months'' after being built, and since the processors were no longer making proper contact with the heatsinks, this made them prone to overheating, shortening the chips' lifespan significantly, especially the RSX, and potentially disrupting connections between the chips and the motherboard due to extreme heat. Worse is that Sony connected the chip dies to their integrated heat spreaders with that same thermal paste instead of solder, requiring a delidding of the chips to fully correct the problem, which could potentially brick the system if not performed correctly. But that wasn't all: Sony also used NEC/TOLKIN capacitors in the phat model [=PS3s=], which while less expensive and more compact than traditional capacitors, they also turned out to be less reliable and were prone to failure, especially under the excessive heat created by the thermal paste drying out, or under the stress of running more demanding titles from late in the system's life like ''VideoGame/GranTurismo 6'' or ''VideoGame/TheLastOfUs''. Sony corrected these problems with the Slim and Super Slim models.

to:

** The usage of an optical disc format on the PSP can qualify. On paper, it made perfect sense to choose optical discs over cartridges because of the former's storage size advantages and relative low manufacturing expense, and it enabled the release of high-quality video on the PSP. After all, utilizing optical discs instead of cartridges had propelled the original UsefulNotes/PlayStation to much greater success than [[UsefulNotes/Nintendo64 its competition from Nintendo]], so there was no reason to assume the same thing wouldn't happen again in the handheld market. However, in practice, optical discs quickly proved to be a poor fit for a handheld system: Sony's Universal Media Disc (UMD) was fragile, as there have been many cases of the outer plastic shell which protected the actual data disc cracking and breaking, rendering the disc useless until the user buys an aftermarket replacement shell. In addition, it wasn't uncommon for the PSP's UMD drive to fail due to wear of the moving parts. UMD also failed to catch on as a video format due to their proprietary technology making them more expensive to produce than other optical disc formats, and thus were priced higher than [=DVDs=] despite holding a fifth as much data. There were also no devices besides the PSP that could play UMD movies, meaning that you were stuck watching your UMD movies on the PSP's small screen, and couldn't swap them with your friends unless they too had a PSP. This drove away consumers, who would rather purchase a portable DVD player and have access to a cheaper media library, while the more tech-savvy can rip their [=DVDs=] and put them on Memory Sticks to watch on PSP without a separate disc. In addition, [[LoadsAndLoadsOfLoading UMD load times were]] ''[[LoadsAndLoadsOfLoading long]]'', compared to that of a standard UsefulNotes/NintendoDS cartridge, which Sony themselves tried to fix by doubling the memory of the PSP in later models to use as a UMD cache. By 2009, Sony themselves were trying to phase out the UMD with the PSP Go, which did not have a UMD drive and relied on digital downloads from the [=PlayStation=] Store, but it was too late; most games were already released on UMD while very few were actually made available digitally, and so the Go blocked off a major portion of PSP games, which led to consumers ignoring the Go. Sony's decision to use standard cartridges for the PSP's successor, the UsefulNotes/PlayStationVita, seemed like a tacit admission that using UMD on the PSP was a mistake.
** Like the Xbox 360, the [=PlayStation=] 3 suffered from some growing pains. It also used the same lead-free solder which was prone to breakage, but while Sony designed the [=PS3=] for quiet operation and overall had a better cooling system than the Xbox 360, there was one major problem: that Sony had used low-quality thermal paste for the Cell and RSX processors that were prone to drying up, and they dried up quickly. The result? [=PS3s=] would begin running loud mere ''months'' after being built, and since the processors were no longer making proper contact with the heatsinks, this made them prone to overheating, shortening the chips' lifespan significantly, especially the RSX, and potentially disrupting connections between the chips and the motherboard due to extreme heat. Worse is that Sony connected the chip dies to their integrated heat spreaders with that same thermal paste instead of solder, requiring a delidding of the chips to fully correct the problem, which could potentially brick the system if not performed correctly. But that wasn't all: Sony also used NEC/TOKIN capacitors in the phat model [=PS3s=], which, while less expensive and more compact than traditional capacitors, also turned out to be less reliable and were prone to failure, especially under the excessive heat created by the thermal paste drying out, or under the stress of running more demanding titles from late in the system's life like ''VideoGame/GranTurismo 6'' or ''VideoGame/TheLastOfUs''. Sony corrected these problems with the Slim and Super Slim models.



* While it is fully in WhatCouldHaveBeen territory, Active Enterprises, the developers of the (in)famous ''Videogame/Action52'', had plans to develop a console, the [[https://bootleggames.fandom.com/wiki/Action_Gamemaster Action Gamemaster]]. Besides its own cartridges, it would have been able to play cartridges of other consoles as well as CD-ROM games besides having a TV tuner. Ergonomics aside, try to imagine the weight and autonomy of such a device with TheNineties technology. And then look at the concept art and realize how bizarrely proportioned everything is, that there's no obvious place to put cartridges, [=CDs=], or required accessories, and how vulnerable that screen is. The phrases "pipe dream" and "ahead of its time" barely even begin to describe this idea, and the fact [[PointyHairedBoss Active Enterprises head Vince Perri]] thought it would be anything other than AwesomeButImpractical just goes to show how overambitious and out of touch with not only the video game market but technology in general the man was.

to:

* While it is fully in WhatCouldHaveBeen territory, Active Enterprises, the developers of the (in)famous ''Videogame/Action52'', had plans to develop a handheld console, the [[https://bootleggames.fandom.com/wiki/Action_Gamemaster Action Gamemaster]]. Besides its own cartridges, it would have been able to play cartridges of other consoles as well as CD-ROM games besides having a TV tuner. Ergonomics aside, try to imagine the weight and autonomy of such a device with TheNineties technology. And then look at the concept art and realize how bizarrely proportioned everything is, that there's no obvious place to put cartridges, [=CDs=], or required accessories, and how vulnerable that screen is. The phrases "pipe dream" and "ahead of its time" barely even begin to describe this idea, and the fact [[PointyHairedBoss Active Enterprises head Vince Perri]] thought it would be anything other than AwesomeButImpractical just goes to show how overambitious and out of touch with not only the video game market but technology in general the man was.
Reason: That's really Idiot Programming rather than Idiot Design


** Early in the life of the 360, many gamers used the console's optional VGA cable to play their games with HD graphics, as true [=HDTVs=] tended to be rare and expensive back then. PC monitors at the time usually had a 4:3 aspect ratio, which most game engines were smart enough to handle by simply sticking black bars at the top and bottom of the screen, with a few even rendering natively at the right resolution. However, some engines (including the one used for ''VideoGame/NeedForSpeed Most Wanted'' and ''Carbon'') instead rendered the game in 480p - likely the only 4:3 resolution they supported - and upscaled the output. Needless to say, playing a 480p game stretched to a higher resolution [[note]](1280×1024 being the most common resolution for mid-range monitors at the time - which you may notice is actually a 5:4 aspect ratio instead of 4:3, meaning that you had pixel distortion to deal with on top of other things)[[/note]] looked awful, and even worse than just playing it on an SDTV.

Added: 824

Changed: 424

Reason: None


** Besides the GPS plastic, some translucent plastic variants are also known to break very easily. The Deluxe-class [[Film/{{Transformers}} 2007 movie]] Brawl figure had its inner gear mechanics made out of such plastic, which tended to shatter right at the figure's first transformation. On top of that, the posts that held its arms in place didn't match the shape of the holes they were supposed to peg into. Thankfully, the toy was released some time later in new colors, which fixed all of these issues.

to:

** Some translucent plastic variants are also known to break very easily when stress is placed on them.
*** The first Deluxe-class [[Film/{{Transformers}} 2007 movie]] Brawl figure has a partial auto-transforming gimmick that relies on internal translucent plastic motion gears, which tend to shatter and make the figure's gimmick stop working. On top of that, the pegs attaching his arms to his shoulders are a different shape than the holes they're supposed to peg into. Thankfully, the toy was released some time later in new colors, which fixed all of these issues.
*** Several car Transformers from the ''[[Toys/TransformersWarForCybertron War for Cybertron]]'' toyline have their entire hoods cast in clear plastic to make their windows and windshields see-through, including the hinges used for transformation. The stress placed on said part from playing with the toy as intended frequently leads to hoods and windshields cracking.

Changed: 1381

Removed: 199

Reason: cruft


** Heat issues are also bad for [=MacBook=] Pros. Not so much for casual users, but very much so for heavy processor load applications. Since the MBP is pretty much ''de rigeur'' for musicians (and almost as much for graphic designers and moviemakers), this is a rather annoying problem since Photoshop with a lot of images or layers or any music software with a large number of tracks ''will'' drive your temperature through the roof. Those who choose to game with a MBP have it even worse - ''VideoGame/WorldOfWarcraft'' will start to cook your MBP within 30 minutes of playing, especially if you have a high room temperature. The solution? Get the free software programs Temperature Monitor and [=SMCFanControl=], then keep an eye on your temps and be very liberal with upping the fans. The only downsides to doing so are more noise, a drop in battery time, and possible fan wear, but that's '''far''' better than your main system components being fried or worn down early.

to:

** Heat issues are also bad for [=MacBook=] Pros. Not so much for casual users, but very much so for heavy processor load applications. Since the MBP is pretty much ''de rigeur'' for musicians (and almost as much for graphic designers and moviemakers), this is a rather annoying problem since Photoshop with a lot of images or layers or any music software with a large number of tracks ''will'' drive your temperature through the roof. Those who choose to game with a MBP have it even worse - ''VideoGame/WorldOfWarcraft'' will start to cook your MBP within 30 minutes of playing, especially if you have a high room temperature. The solution? Get the free software programs Temperature Monitor and [=SMCFanControl=], then keep an eye on your temps and be very liberal with upping the fans. The only downsides to doing so are more noise, a drop in battery time, and possible fan wear, but that's '''far''' better than your main system components being fried or worn down early.



* [[BrokenBase Say what you will about the iPhone 7's merging of the headphone and charger jacks]], but there's no denying that upon closer inspection, this combined with the lack of wireless charging creates a whole new problem. As explained in [[https://www.youtube.com/watch?v=0vYfAqEyGV0 this video]], the charger jack is only capable of withstanding a certain amount of wear and tear (between 5,000 and 10,000 plugs, although only Apple themselves know the exact number). Because you're now using the same jack for two different things, chances are you'll wear it out twice as fast as any other iPhone. Because the phone doesn't have a wireless charging function like most other phones, this means that if this happens your phone is pretty much toast.
* The Apple Magic Mouse 2 tried to improve upon the original Magic Mouse by making its battery rechargeable. Unfortunately, this choice was widely ridiculed as for some reason [[https://static.tvtropes.org/pmwiki/pub/images/zqd1404.png the charging port was located on the underside of the mouse]], rendering it inoperable while plugged in. One wonders why they couldn't just put the port on the front, like pretty much every other chargeable mouse. Apparently, it's to preserve the mouse's aesthetics, in yet another case of Apple [[AwesomeButImpractical favoring aesthetics over usability]].
* Several users have [[https://discussions.apple.com/thread/8223635 reported]] that plugging a charger into their Thunderbolt 3 [=MacBook=] Pro makes the left topmost port output 20V instead of the standard 5V, effectively frying whatever is plugged in. Four ports you could plug a charger in, and they didn't test what happens if you plug the charger in anywhere else?

to:

* [[BrokenBase Say what you will about the iPhone 7's merging of the headphone and charger jacks]], but there's no denying that upon closer inspection, this combined with the lack of wireless charging creates a whole new problem. As explained in [[https://www.youtube.com/watch?v=0vYfAqEyGV0 this video]], the charger jack is only capable of withstanding a certain amount of wear and tear (between 5,000 and 10,000 plugs, although only Apple themselves know the exact number). Because you're now using the same jack for two different things, chances are you'll wear it out twice as fast as any other iPhone. Because the phone doesn't have a wireless charging function like most other phones, this means that if this happens your phone is pretty much toast.
* The Apple Magic Mouse 2 tried to improve upon the original Magic Mouse by making its battery rechargeable. Unfortunately, this choice was widely ridiculed as for some reason [[https://static.tvtropes.org/pmwiki/pub/images/zqd1404.png the charging port was located on the underside of the mouse]], rendering it inoperable while plugged in. One wonders why they couldn't just put the port on the front, like almost every other chargeable mouse. Apparently, it's to preserve the mouse's aesthetics, in yet another case of Apple [[AwesomeButImpractical favoring aesthetics over usability]].
* Several users have [[https://discussions.apple.com/thread/8223635 reported]] that plugging a charger into their Thunderbolt 3 [=MacBook=] Pro makes the left topmost port output 20V instead of the standard 5V, effectively frying whatever is plugged in. Four ports you could plug a charger in, and they didn't test what happens if you plug the charger in anywhere else.



* The "Prescott" core Pentium 4 has a reputation for being the worst CPU design in history. It had some design trade-offs which lessened the processor's performance-per-clock over the original Pentium 4 design, but theoretically allowed the Prescott to run at much higher clockspeeds. Unfortunately, these changes also made the Prescott vastly ''hotter'' than the original design -- something that was admittedly exacerbated than their [=90nm=] manufacturing process actually being worse for power consumption than the previous [=130nm=] process when it came to anything clocked higher than around [=3GHz=] -- making it impossible for Intel to actually achieve the clockspeeds they wanted. Moreover, they totally bottlenecked the processor's performance, meaning that Intel's usual performance-increasing tricks (more cache and faster system buses) did nothing to help. By the time Intel came up with a new processor that put them back in the lead, the once hugely valuable Pentium brand had been rendered utterly worthless by the whole Prescott fiasco, and the new processor (based on the Pentium III microarchitecture) was instead called the Core 2. The Pentium name is still in use, but is [[DemotedToExtra applied to the mid-end processors that Intel puts out for cheap-ish computers]], somewhere in between the low-end Celerons and the high-end Core line.

to:

* The "Prescott" core Pentium 4 has a reputation for being the worst CPU design in history. It had some design trade-offs which lessened the processor's performance-per-clock over the original Pentium 4 design, but theoretically allowed the Prescott to run at much higher clockspeeds. Unfortunately, these changes also made the Prescott vastly ''hotter'' than the original design -- something that was admittedly exacerbated than their [=90nm=] manufacturing process actually being worse for power consumption than the previous [=130nm=] process when it came to anything clocked higher than around [=3GHz=] -- making it impossible for Intel to actually achieve the clockspeeds they wanted. Moreover, they totally bottlenecked the processor's performance, meaning that Intel's usual performance-increasing tricks (more cache and faster system buses) did nothing to help. By the time Intel came up with a new processor that put them back in the lead, the once hugely valuable Pentium brand had been rendered utterly worthless by the whole Prescott fiasco, and the new processor (based on the Pentium III microarchitecture) was instead called the Core 2. The Pentium name is still in use, but is [[DemotedToExtra applied to the mid-end processors that Intel puts out for cheap-ish computers]], somewhere in between the low-end Celerons and the high-end Core line.



* The Prescott probably deserves the title of worst [=x86=] CPU design ever (although there might be a case for the 80286), but allow us to introduce you to Intel's ''other'' CPU project in the same era: the Itanium. Designed for servers, using a bunch of incredibly-cutting-edge hardware design ideas. Promised to be incredibly fast. The catch? It could only hit that theoretical speed promise if the compiler generated ''perfectly optimized'' machine code for it. It turned out you ''couldn't'' optimize most of the code that runs on servers that hard, because programming languages suck,[[note]]every instruction typed in requires the computer to "think" about it, taking processor speed; optimization simply means the computer needs to think less overall to get the same result[[/note]] and even if you could, the compilers of the time weren't up to it. It also turned out if you ''didn't'' give the thing perfectly optimized code, it ran about half as fast as the Pentium 4 and sucked down twice as much electricity doing it. Did we mention this was right about the time server farm operators started getting serious about cutting their electricity and HVAC bills?
** Making things worse, this was actually Intel's ''third'' attempt at implementing such a design. The failure of their first effort, the [=iAPX-432=], was somewhat forgivable given that it wasn't really possible to achieve what Intel wanted on the manufacturing processes available in the early 1980s. What really should have taught them the folly of their ways came later in the decade with the [=i860=], a much better implementation of what they had tried to achieve with the [=iAPX-432=]... which still happened to be both slower and vastly more expensive than not only the 80386 (bear in mind Intel released the 80'''4'''86 a few months before the [=i860=]) but also the [=i960=], a much simpler and cheaper design which subsequently became the EnsembleDarkhorse of Intel and is still used today in certain roles.

to:

* The Prescott probably deserves the title of worst [=x86=] CPU design ever (although there might be a case for the 80286), but allow us to introduce you to Intel's ''other'' CPU project in the same era: the Itanium. Designed for servers, using a bunch of incredibly-cutting-edge hardware design ideas. Promised to be incredibly fast. The catch? It could only hit that theoretical speed promise if the compiler generated ''perfectly optimized'' machine code for it. It turned out you ''couldn't'' optimize most of the code that runs on servers that hard, because programming languages suck,[[note]]every instruction typed in requires the computer to "think" about it, taking processor speed; optimization simply means the computer needs to think less overall to get the same result[[/note]] and even if you could, the compilers of the time weren't up to it. It also turned out if you ''didn't'' give the thing perfectly optimized code, it ran about half as fast as the Pentium 4 and sucked down twice as much electricity doing it. This was right about the time server farm operators started getting serious about cutting their electricity and HVAC bills, too.
** Making things worse, this was actually Intel's ''third'' attempt at implementing such a design. The failure of their first effort, the [=iAPX-432=], was somewhat forgivable given that it wasn't really possible to achieve what Intel wanted on the manufacturing processes available in the early 1980s. What really should have taught them the folly of their ways came later in the decade with the [=i860=], a much better implementation of what they had tried to achieve with the [=iAPX-432=]... which still happened to be both slower and vastly more expensive than not only the 80386 (Intel released the 80'''4'''86 a few months before the [=i860=]) but also the [=i960=], a much simpler and cheaper design which subsequently became the EnsembleDarkhorse of Intel and is still used today in certain roles.



*** The failure of the first Itanium was largely down to the horrible cache system that Intel designed for it. While the L1 and L2 caches were both reasonably fast (though the L2 cache was a little on the small side), the L3 cache used the same off-chip cache system designed three years previously for the original Pentium II Xeon. By the time the Itanium had hit the streets however, running external cache chips at CPU speeds just wasn't possible anymore without some compromise, so Intel decided to give them extremely high latency. This proved to be an absolutely disastrous design choice, and basically negated the effects of the cache. Moreover, Itanium instructions are four times larger than [=x86=] ones, leaving the chip strangled between its useless L3 cache, and L1 and L2 caches that weren't big or fast enough to compensate. Most of the improvement in Itanium 2 came from Intel simply making the L1 and L2 caches similar sizes but much faster, and incorporating the L3 cache into the CPU die.

to:

*** The failure of the first Itanium was largely down to the horrible cache system that Intel designed for it. While the L1 and L2 caches were both reasonably fast (though the L2 cache was a little on the small side), the L3 cache used the same off-chip cache system designed three years previously for the original Pentium II Xeon. By the time the Itanium had hit the streets however, running external cache chips at CPU speeds just wasn't possible anymore without some compromise, so Intel decided to give them extremely high latency. This proved to be an absolutely disastrous design choice, and basically negated the effects of the cache. Moreover, Itanium instructions are four times larger than [=x86=] ones, leaving the chip strangled between its useless L3 cache, and L1 and L2 caches that weren't big or fast enough to compensate. Most of the improvement in Itanium 2 came from Intel simply making the L1 and L2 caches similar sizes but much faster, and incorporating the L3 cache into the CPU die.



** Averted wonderfully with Bay Trail, which managed to defeat VIA chips and be equal to AMD low-tier chips (which had a very awful moment with Kabini and Temash) while having decent GPU performance.



** Firstly there was the optional Memory Translator Hub (MTH) component of the 820 chipset, which was supposed to allow the usage of more reasonably-priced SDRAM instead of the uber-expensive RDRAM that the baseline 820 was only compatible with. Unfortunately the MTH basically didn't work at all in this role (causing abysmally poor performance and system instability) and was rapidly discontinued, eventually forcing Intel to create the completely new 815 chipset to provide a more reasonable alternative for consumers.

to:

** Firstly there was the optional Memory Translator Hub (MTH) component of the 820 chipset, which was supposed to allow the usage of more reasonably-priced SDRAM instead of the uber-expensive RDRAM that the baseline 820 was only compatible with. Unfortunately the MTH basically didn't work at all in this role (causing abysmally poor performance and system instability) and was rapidly discontinued, eventually forcing Intel to create the completely new 815 chipset to provide a more reasonable alternative for consumers.



** Speaking of the 820 chipset, anyone remember [[https://en.wikipedia.org/wiki/RDRAM RDRAM]]? It was touted by Intel and Rambus as a high-performance RAM for the Pentium III to be used in conjunction with the 820. But implementation-wise, it was not up to snuff (in fact, benchmarks revealed that applications ran slower with RDRAM than with the older SDRAM!), not to mention very expensive, and third-party chipset makers (such as [=SiS=], who gained some fame during this era) went to cheaper DDR RAM instead (and begrudgingly, so did Intel, leaving Rambus with egg on its face), which ultimately became the de facto industry standard. RDRAM still found use in other applications, though, like the UsefulNotes/Nintendo64 and UsefulNotes/PlayStation2... where it turned out to be one of the biggest performance bottlenecks on both systems -- the [=N64=] had twice the memory (and twice the bandwidth on it) of the UsefulNotes/{{PlayStation}}, but such high latency on it, combined with a ridiculously small buffer for textures loaded into memory to be applied, that it negated those advantages entirely, while the [=PS2=]'s memory, though having twice the clock speed of the UsefulNotes/{{Xbox}}'s older SDRAM, could only afford half as much memory and half the bandwidth on it, contributing to it having [[LoadsAndLoadsOfLoading the longest load times]] of its generation.

to:

** Speaking of the 820 chipset, anyone remember [[https://en.wikipedia.org/wiki/RDRAM RDRAM]]? It was touted by Intel and Rambus as high-performance RAM for the Pentium III, to be used in conjunction with the 820. But implementation-wise it was not up to snuff (in fact, benchmarks revealed that applications ran slower with RDRAM than with the older SDRAM!) and very expensive; third-party chipset makers (such as [=SiS=], who gained some fame during this era) went to cheaper DDR RAM instead (and begrudgingly, so did Intel, leaving Rambus with egg on its face), which ultimately became the de facto industry standard. RDRAM still found use in other applications, though, like the UsefulNotes/Nintendo64 and UsefulNotes/PlayStation2... where it turned out to be one of the biggest performance bottlenecks on both systems. The [=N64=] had twice the memory of the UsefulNotes/{{PlayStation}} (and twice the bandwidth on it), but the latency was so high, combined with a ridiculously small buffer for textures loaded into memory, that it negated those advantages entirely, while the [=PS2=]'s memory, though clocked twice as fast as the UsefulNotes/{{Xbox}}'s older SDRAM, could only afford half as much capacity and half the bandwidth, contributing to it having [[LoadsAndLoadsOfLoading the longest load times]] of its generation.



* Intel's [=SSDs=] have a particular failure mode that rubs people the wrong way: After the drive determines that it's lifespan is up from a certain amount of writes, the drive goes into read-only mode. This is great and all until you consider that the number of writes is often lower compared to other [=SSDs=] of similar technology (up to about 500 terabytes versus 5 petabytes) and that you only have '''one chance''' to read the data off the drive. If you reboot (and chances are, you would because guess what happens when the OS drive suddenly becomes read-only and the system could no longer write to it's swap file? Yep, Windows spazzes out and starts intermittently freezing and would eventually BSOD), the drive then goes to an unusable state, regardless of whether or not the data on the drive is still good. Worst of all is that Intel's head of storage division simply brushes it off, claiming that "the data on the disk is unreliable after it's life is up" and that "people should be making regular backups nightly".
* In early 2018, a hardware bug across all x86-based Intel processors (except the pre-2013 Atoms) released since ''1995'' known as Meltdown caused speculative code (that is, machine code that the CPU predicts that it will need to run and tries while it waits for the "actual" instructions to arrive) to be run before any checks that the code was allowed to run at the required privilege level. This could cause ring-3 code (user-level) to access ring-0 (kernel-level) data. The result? Any kernel developers developing for Intel needed to scramble to patch their page tables so they would do extra babysitting on affected processors. This could cause a 7%-30% performance reduction on ''every single Intel chip in the hands of anyone who had updated their hardware in the last decade'', with performance loss depending on multiple factors. Linux kernel developers considered nicknaming their fix for the bug "Forcefully Unmap Complete Kernel With Interrupt Trampolines" ([[FunWithAcronyms FUCKWIT]]), which is surely the adjective they thought would best describe Intel hardware designers at that time. It subsequently turned out that Intel's weren't the only [=CPUs=] to be vulnerable to this form of attack, but of the major manufacturers, theirs are by far the most severely affected (AMD's current Ryzen line-up is almost totally immune to the exploits, and while ARM processors are also vulnerable they tend to operate in much more locked-down environments than [=x86=] processors, making it harder to push the exploit through). The performance penalties that the fix required ended up ensuring that when AMD released their second-generation Ryzen processors later that year, it turned what would likely have been a fairly even match-up into one where AMD was able to score some very definite victories. It also forced Intel to basically redesign their future chipsets from the ground up to incorporate the fixes into their future processor lines.

to:

* Intel's [=SSDs=] have a particular failure mode that rubs people the wrong way: after the drive determines that its lifespan is up from a certain amount of writes, it goes into read-only mode. This is great and all until you consider that the number of writes is often lower compared to other [=SSDs=] of similar technology (up to about 500 terabytes versus 5 petabytes) and that you only have '''one chance''' to read the data off the drive. If you reboot (and chances are you would, because guess what happens when the OS drive suddenly becomes read-only and the system can no longer write to its swap file? Yep, Windows freaks out, starts intermittently freezing, and will eventually BSOD), the drive then goes into an unusable state, regardless of whether or not the data on the drive is still good. Worst of all is that Intel's head of storage division simply brushed it off, claiming that "the data on the disk is unreliable after its life is up" and that "people should be making regular backups nightly".
* In early 2018, a hardware bug across all x86-based Intel processors (except the pre-2013 Atoms) released since ''1995'', known as Meltdown, caused speculative code (that is, machine code that the CPU predicts it will need to run and tries while it waits for the "actual" instructions to arrive) to be run before any checks that the code was allowed to run at the required privilege level. This could cause ring-3 (user-level) code to access ring-0 (kernel-level) data. The result? Any kernel developers developing for Intel needed to scramble to patch their page tables so they would do extra babysitting on affected processors. This could cause a 7%-30% performance reduction on ''every single Intel chip in the hands of anyone who had updated their hardware in the last decade'', with performance loss depending on multiple factors. Linux kernel developers considered nicknaming their fix for the bug "Forcefully Unmap Complete Kernel With Interrupt Trampolines" ([[FunWithAcronyms FUCKWIT]]), which is surely the term they felt best described Intel's hardware designers at the time. It subsequently turned out that Intel's weren't the only [=CPUs=] to be vulnerable to this form of attack, but of the major manufacturers, theirs are by far the most severely affected (AMD's current Ryzen line-up is almost totally immune to the exploits, and while ARM processors are also vulnerable, they tend to operate in much more locked-down environments than [=x86=] processors, making it harder to push the exploit through). The performance penalties that the fix required ended up ensuring that when AMD released their second-generation Ryzen processors later that year, what would likely have been a fairly even match-up turned into one where AMD was able to score some very definite victories. It also forced Intel to redesign their future processor lines from the ground up to incorporate fixes in hardware.
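For the technically curious, the core of the trick looks something like the C sketch below - a hand-wavy illustration of the access pattern, not a working exploit, and the names `kernel_ptr` and `probe` are made up for the example. The point is that the forbidden read faults, but on affected chips the dependent array access still runs speculatively and leaves a measurable footprint in the cache.

    #include <stdint.h>

    /* Illustration only -- not a working exploit. 'kernel_ptr' stands in for a
       privileged address the attacker has no right to read; 'probe' is an
       ordinary user-space array used as a cache side channel.                */

    static uint8_t probe[256 * 4096];     /* one page per possible byte value */

    void transient_read(const volatile uint8_t *kernel_ptr)
    {
        /* Architecturally this load faults (#PF) because user code may not
           read kernel memory -- but on pre-fix Intel CPUs the dependent load
           below is still executed speculatively before the fault arrives.    */
        uint8_t secret = *kernel_ptr;
        volatile uint8_t sink = probe[secret * 4096]; /* warms one cache line */
        (void)sink;
    }

    /* The attacker then times reads of probe[i * 4096] for i = 0..255: the
       one fast index reveals the byte. KPTI (the "FUCKWIT" fix) unmaps kernel
       pages from user-mode page tables, so the transient load has nothing to
       hit.                                                                   */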



** As for why Intel processors were designed processors this way, a [[https://security.stackexchange.com/questions/177100/why-are-amd-processors-not-less-vulnerable-to-meltdown-and-spectre/177101#177101 Stack Exchange post]] offers an answer: the conditions that allow the Meltodwn exploit to happen in the first place are a rare case in practice, and in the name of keeping the hardware as simple as possible, which is practical for a variety of reasons, nothing was done to address this edge case. Not to mention, along with most other security issues, unless there's a practical use for an vulnerability and the effort to exploit it is low enough, it may not be addressed since the chances of it happening are not worth the incoveniences it comes with. As an anology, you can make your home as secure as Fort Knox, but why would you if your home isn't under the same conditions (i.e., holding something valuable and people are aware of it)?
* In the early '90s, to replace the aging 8259 programmable interrupt controller, Intel created what they called the Advanced Programmable Interrupt Controller, or APIC. Configuring the APIC typically required writing to some address space and it was configurable. Since the APIC was backwards-compatible with the 8259 and the 8259 had a fixed location, older software would break if the APIC wasn't located somewhere the software was expecting. So Intel allowed the memory address window of the APIC to be moved. This wouldn't be so much of a problem until they introduced the Intel Management Engine (IME), a standalone processor designed to manage the PC such as cooling and remote access. The problem with the IME is it's basically a master system that can control the CPU so care needs to be taken that it can't be taken over. But security research in 2013 found out that in earlier systems with IME, you could slide the APIC address window over where the address space for IME is located, inject code into it, and do whatever you want with the compromised computer and ''nothing in the computer'' can stop this because the IME has the highest level of access of ''anything'' in the computer. Intel independently found this around the same time, and it has since been fixed by basically denying the APIC address window from sliding over the IME one.

to:

** As for why Intel designed their processors this way, a [[https://security.stackexchange.com/questions/177100/why-are-amd-processors-not-less-vulnerable-to-meltdown-and-spectre/177101#177101 Stack Exchange post]] offers an answer: the conditions that allow the Meltdown exploit to happen in the first place are rare in practice, and in the name of keeping the hardware as simple as possible (which is practical for a variety of reasons), nothing was done to address this edge case. As with most other security issues, unless there's a practical use for a vulnerability and the effort to exploit it is low enough, it may not be addressed, since the chances of it happening aren't worth the inconveniences the fix comes with. As an analogy, you can make your home as secure as Fort Knox, but there's no reason to do so if your home isn't under the same conditions (i.e., holding something valuable that people are aware of).
* In the early '90s, to replace the aging 8259 programmable interrupt controller, Intel created what they called the Advanced Programmable Interrupt Controller, or APIC. The APIC is configured by writing to a block of memory-mapped address space. Since the APIC was backwards-compatible with the 8259 and the 8259 had a fixed location, older software would break if the APIC wasn't located somewhere the software was expecting, so Intel allowed the APIC's memory address window to be moved. This wasn't much of a problem until they introduced the Intel Management Engine (IME), a standalone processor that manages the PC, handling things such as cooling and remote access. The problem with the IME is that it's basically a master system that can control the CPU, so care needs to be taken that it can't be taken over. But security research in 2013 found that on earlier systems with the IME, you could slide the APIC address window over the address space where the IME lives, inject code into it, and do whatever you want with the compromised computer - and ''nothing in the computer'' can stop this, because the IME has the highest level of access of ''anything'' in the machine. Intel independently found this around the same time, and it has since been fixed by simply refusing to let the APIC address window slide over the IME's.
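The eventual fix boils down to a bounds check: before honouring a request to relocate the APIC's register window, make sure the new window can't land on top of the Management Engine's address range. Below is a minimal C sketch of that idea; the addresses, sizes, and function names are all hypothetical, not Intel's actual firmware code.

    #include <stdbool.h>
    #include <stdint.h>

    /* Hypothetical illustration of the fix: reject any APIC relocation whose
       MMIO window would overlap the Management Engine's address range.
       All addresses and sizes below are made up for the example.            */

    #define APIC_WINDOW_SIZE 0x1000u                /* 4 KiB register window */

    static const uint64_t IME_BASE = 0xFED1C000u;   /* hypothetical */
    static const uint64_t IME_SIZE = 0x4000u;       /* hypothetical */

    static bool ranges_overlap(uint64_t a, uint64_t a_len,
                               uint64_t b, uint64_t b_len)
    {
        return a < b + b_len && b < a + a_len;
    }

    /* Returns true only if the requested new APIC base is safe to apply. */
    bool apic_relocation_allowed(uint64_t new_apic_base)
    {
        return !ranges_overlap(new_apic_base, APIC_WINDOW_SIZE,
                               IME_BASE, IME_SIZE);
    }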



** Things got better with Phenom II, which improved dramatically the performance, going near Intel once again and even more, their 6-core chips were good enough to get some buyers (and even defeating in some uses the first generation of Core i7), indicating that the original Phenom was an otherwise-sound design brought down by poor clock speeds, the TLB glitch, and not enough cache. Which is still more than can be said for the next major design...

to:

** Things got better with Phenom II, which dramatically improved performance and put AMD close to Intel once again; better still, their 6-core chips were good enough to win over some buyers (even defeating the first generation of Core i7 in some uses), indicating that the original Phenom was an otherwise-sound design brought down by poor clock speeds, the TLB glitch, and not enough cache. Which is still more than can be said for the next major design...



** When AMD announced the RX 6500 XT, they further reduced the lane count to 4 and there's only a 4GB VRAM version. [[https://www.youtube.com/watch?v=rGG2GYwnhMs A reviewer]] tested the RX 5500 XT with 4 lanes to simulate how bad this could be and found performance drops by a staggering ''~40%''. While PCI Express 4.0 has had wider adoption at the time of its release, the card is more attractive to people with older systems that have 3.0 slots. When the card launched, reviewers were not kind to it, noting that the card it was meant to replace, the RX 5500 XT, could still beat it and it even loses to NVIDIA's previous generation card in the similar market category. It also lacked certain multimedia features such as [=AV1=] decoding support. "Technical staff" from AMD went onto forums to explain that the RX 6500 XT was meant to be used in laptops, pairing it with one of their upcoming [=APUs=] that included the missing features, with the strong inference that it was only released on desktops due to the supply issues facing the market.[[note]](Something that disproportionately affected AMD considering that they had eschewed on-board [=GPUs=] on the majority of their CPU line-up in favor of adding more cores, something which came back to bite them on the butt -- though, in fairness, in a way they could never have reasonably predicted -- when the combination of the UsefulNotes/COVID19Pandemic and the cryptocurrency explosion made it all-but-impossible to buy affordable graphics cards)[[/note]] Even worse, because the GPU can only support two VRAM chips, chances for an 8GB version were pretty much gone since 2GB chips were the maximum size available with 4GB chips not on anyone's roadmap.

to:

** When AMD announced the RX 6500 XT, they further reduced the lane count to 4, and there's only a 4GB VRAM version. [[https://www.youtube.com/watch?v=rGG2GYwnhMs A reviewer]] tested the RX 5500 XT with 4 lanes to simulate how bad this could be and found performance drops by a staggering ''~40%''. While PCI Express 4.0 had seen wider adoption by the time of its release, the card is most attractive to people with older systems that only have 3.0 slots. When the card launched, reviewers were not kind to it, noting that the card it was meant to replace, the RX 5500 XT, could still beat it, and it even lost to NVIDIA's previous-generation card in a similar market category. It also lacked certain multimedia features such as [=AV1=] decoding support. "Technical staff" from AMD went onto forums to explain that the RX 6500 XT was meant to be used in laptops, paired with one of their upcoming [=APUs=] that included the missing features, with the strong implication that it was only released on desktops due to the supply issues facing the market.[[note]](Something that disproportionately affected AMD considering that they had eschewed on-board [=GPUs=] on the majority of their CPU line-up in favor of adding more cores, something which came back to bite them on the butt -- though, in fairness, in a way they could never have reasonably predicted -- when the combination of the UsefulNotes/COVID19Pandemic and the cryptocurrency explosion made it all-but-impossible to buy affordable graphics cards)[[/note]] Even worse, because the GPU can only support two VRAM chips, there was little if any chance of an 8GB version, since 2GB chips were the maximum size available and 4GB chips weren't on anyone's roadmap.
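To put rough numbers on why the cut-down link hurts: PCI Express 3.0 and 4.0 both use 128b/130b encoding, at 8 and 16 GT/s per lane respectively, so a back-of-the-envelope calculation (nominal figures, ignoring protocol overhead beyond the line encoding) shows an x4 link on a 3.0 board delivering roughly half the bandwidth the card was designed around - which matters all the more when only 4GB of VRAM means data regularly spills over the bus.

    #include <stdio.h>

    /* Rough per-direction link bandwidth for a 4-lane card on PCIe 3.0 vs 4.0.
       Nominal figures only: 8 / 16 GT/s per lane with 128b/130b encoding.    */
    int main(void)
    {
        const double encoding  = 128.0 / 130.0;            /* 128b/130b        */
        const double gen3_lane = 8.0  * encoding / 8.0;    /* ~0.985 GB/s/lane */
        const double gen4_lane = 16.0 * encoding / 8.0;    /* ~1.969 GB/s/lane */

        printf("PCIe 3.0 x4: %.1f GB/s\n", 4 * gen3_lane);  /* ~3.9 GB/s */
        printf("PCIe 4.0 x4: %.1f GB/s\n", 4 * gen4_lane);  /* ~7.9 GB/s */
        printf("PCIe 3.0 x8: %.1f GB/s\n", 8 * gen3_lane);  /* ~7.9 GB/s */
        return 0;
    }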



* The Coleco Adam, a 1983 computer based on the fairly successful UsefulNotes/ColecoVision console, suffered from a host of problems and baffling design decisions. Among the faults were the use of a proprietary tape drive which was prone to failure, locating the whole system's power supply in the ''printer'' of all places (meaning the very limited daisy-wheel printer couldn't be replaced, and if it broke down or was absent, the whole computer was rendered unusable), and poor electromagnetic shielding which could lead to tapes and disks being ''erased'' at startup. Even after revised models ironed out the worst bugs, the system was discontinued after less than 2 years and sales of 100,000 units.

to:

* The Coleco Adam, a 1983 computer based on the fairly successful UsefulNotes/ColecoVision console, suffered from a host of problems and baffling design decisions. Among the faults were the use of a proprietary tape drive which was prone to failure, locating the whole system's power supply in the ''printer'' of all places (meaning the very limited daisy-wheel printer couldn't be replaced, and if it broke down or was absent, the whole computer was rendered unusable), and poor electromagnetic shielding which could lead to tapes and disks being ''erased'' at startup. Even after revised models ironed out the worst bugs, the system was discontinued after less than 2 years and sales of 100,000 units.



* The Samsung Galaxy Fold would have been released in April 2019 had catastrophic teething problems not come to light just days before release. On the surface, it was revolutionary: a smartphone that could be folded up like the flip phones of the '90s and 2000s, allowing a pocket-sized device to have a 7.3-inch screen like a small tablet computer. Unfortunately, reviewers' $1,980 Galaxy Note phones started [[https://www.tomsguide.com/us/galaxy-fold-problems-bad-for-foldable-phones,news-29895.html breaking]] just a few days after they got them, in what was possibly the worst possible first impression for foldable smartphones. Many of the problems could be traced to people removing the protective film on the screen - a film that, in a true case of this trope, looked almost exactly like the thin plastic screen protectors that normally ship on the screens of other smartphones and tablets to keep them from getting scratched or accumulating dust during transport, leaving many to wonder just why Samsung not only made a necessary part of the phone so easy to remove, but made it look so similar to a part of other phones that is designed to be removed before use.

to:

* The Samsung Galaxy Fold would have been released in April 2019 had catastrophic teething problems not come to light just days before release. On the surface, it was revolutionary: a smartphone that could be folded up like the flip phones of the '90s and 2000s, allowing a pocket-sized device to have a 7.3-inch screen like a small tablet computer. Unfortunately, reviewers' $1,980 Galaxy Fold units started [[https://www.tomsguide.com/us/galaxy-fold-problems-bad-for-foldable-phones,news-29895.html breaking]] just a few days after they got them, in one of the worst possible first impressions for foldable smartphones. Many of the problems could be traced to people removing the protective film on the screen - a film that, in a true case of this trope, looked almost exactly like the thin plastic screen protectors that normally ship on the screens of other smartphones and tablets to keep them from getting scratched or accumulating dust during transport, leaving many to wonder just why Samsung not only made a necessary part of the phone so easy to remove, but made it look so similar to a part of other phones that is designed to be removed before use.



* The infamous A20 line. Due to the quirk in how its addressing system worked [[note]](basically, they've skipped on the bounds check there)[[/note]], Intel's 8088/86 [=CPUs=] could theoretically address slightly more than their advertised 1 MB. But because they ''physically'' still had only 20 address pins, the resulting address just wrapped over, so the last 64K of memory actually was the same as the first. Some early programmers [[note]](among them, Microsoft; the CALL 5 entry point in MS-DOS relies on this behavior)[[/note]] were, unsurprisingly, stupid enough to use this almost-not-a-bug [[IdiotBall as a feature]]. So, when the 24-bit 80286 rolled in, a problem arose - nothing wrapped anymore. In a truly stellar example of a "compatibility is God" thinking, IBM engineers couldn't think up anything better than to simply ''block'' the offending 21st pin (the aforementioned A20 line) on the motherboard side, making the 286 unable to use a solid chunk of its memory above 1 meg until this switch was turned on. This might have been an acceptable (if very clumsy) solution had IBM defaulted to having the A20 line enabled and provided an option to disable it when needed, but instead they decided to have it always turned off unless the OS specifically enables it. By the 386 times, no sane programmer used that "wrapping up" trick any more, but turning the A20 line on is ''still'' among the very first things any PC OS has to do. It wasn't until Intel introduced the Core i7 in 2008 that they finally decided "screw it" and locked the A20 line into being permanently enabled.

to:

* The infamous A20 line. Due to a quirk in how its addressing system worked [[note]](they skipped the bounds check)[[/note]], Intel's 8088/86 [=CPUs=] could theoretically address slightly more than their advertised 1 MB. But because they ''physically'' had only 20 address pins, the resulting address just wrapped over, so the last 64K of memory was actually the same as the first. Some early programmers [[note]](among them Microsoft; the CALL 5 entry point in MS-DOS relies on this behavior)[[/note]] were, unsurprisingly, stupid enough to use this almost-not-a-bug [[IdiotBall as a feature]]. So, when the 24-bit 80286 rolled in, a problem arose - nothing wrapped anymore. In a truly stellar example of "compatibility is God" thinking, IBM engineers couldn't think up anything better than to simply ''block'' the offending 21st pin (the aforementioned A20 line) on the motherboard side, making the 286 unable to use a solid chunk of its memory above 1 meg until this switch was turned on. This might have been an acceptable (if very clumsy) solution had IBM defaulted to having the A20 line enabled and provided an option to disable it when needed, but instead they decided to have it always turned off unless the OS specifically enables it. By the 386 era, no sane programmer used that wrapping trick any more, but turning the A20 line on is ''still'' among the very first things any PC OS has to do. It wasn't until Intel introduced the Core i7 in 2008 that they finally decided "screw it" and locked the A20 line into being permanently enabled.
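For anyone who wants to see the arithmetic, here's a small C sketch of real-mode address formation and the wrap: FFFF:0010 works out to linear address 0x100000, which an 8086 (20 address pins) silently truncates back to 0x00000, while a 286 with the A20 line enabled actually reaches the byte just past 1 MB.

    #include <stdio.h>
    #include <stdint.h>

    /* Real-mode addresses are segment * 16 + offset, which can exceed 20 bits.
       With only 20 address pins (8086) bit 20 is simply lost; with the A20
       line enabled (286 and later) it isn't.                                 */
    static uint32_t phys_addr(uint16_t seg, uint16_t off, int a20_enabled)
    {
        uint32_t linear = ((uint32_t)seg << 4) + off;
        return a20_enabled ? linear : (linear & 0xFFFFFu);  /* mask off bit 20 */
    }

    int main(void)
    {
        /* FFFF:0010 -> 0x100000 linear, i.e. the first byte past 1 MB. */
        printf("8086 (no A20):  %05X\n", (unsigned)phys_addr(0xFFFF, 0x0010, 0));
        printf("286+ (A20 on): %06X\n", (unsigned)phys_addr(0xFFFF, 0x0010, 1));
        return 0;
    }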



** The software to program virtually any Motorola radio, even newer ones, is absolutely ancient. You can only connect via serial port. An actual serial port - USB to serial adapter generally won't work. And the system it's running on has to be basically stone age (Pentium Is from 1993 are generally too fast), meaning that in most radio shops there's a 486 in the corner just for programming them. Even the new XPR line can't generally be programmed with a computer made after 2005 or so.

to:

** The software to program virtually any Motorola radio, even newer ones, is absolutely ancient. You can only connect via serial port - an actual serial port, as a USB-to-serial adapter generally won't work. And the system it's running on has to be basically stone age (Pentium Is from 1993 are generally too fast), meaning that in most radio shops there's a 486 in the corner just for programming them. Even the new XPR line generally can't be programmed with a computer made after 2005 or so.



** The Floppy Disk Controller chip used in the Model I EI could only read and write Single Density disks. Soon afterwards a new FDC chip became available which could read and write Double Density (a more efficient encoding method that packs 80% more data in the same space). The new FDC chip was ''almost'' pin-compatible with the old one, but not quite. One of the values written to the header of each data sector on the disk was a 2-bit value called the "Data Address Mark". 2 pins on the single-density FDC chip were used to specify this value. As there were no spare pins available on the DD FDC chip, one of these pins was reassigned as the "density select" pin. Therefore the DD FDC chip could only write the first 2 of the 4 possible DAM values. Guess which value TRS-DOS used? Several companies (starting with Percom, and eventually even Radio Shack themselves) offered "doubler" adapters - a small circuit board containing sockets for ''both'' FDC chips! To install the doubler, you had to remove the SD FDC chip from the EI, plug it into the empty socket on the doubler PCB, then plug the doubler into the vacated FDC socket in the EI. Logic on the doubler board would select the correct FDC chip.

to:

** The Floppy Disk Controller chip used in the Model I EI could only read and write Single Density disks. Soon afterwards a new FDC chip became available which could read and write Double Density (a more efficient encoding method that packs 80% more data in the same space). The new FDC chip was ''almost'' pin-compatible with the old one, but not quite. One of the values written to the header of each data sector on the disk was a 2-bit value called the "Data Address Mark". Two pins on the single-density FDC chip were used to specify this value. As there were no spare pins available on the DD FDC chip, one of these pins was reassigned as the "density select" pin. Therefore the DD FDC chip could only write the first two of the four possible DAM values. Guess which value TRS-DOS used? Several companies (starting with Percom, and eventually even Radio Shack themselves) offered "doubler" adapters - a small circuit board containing sockets for ''both'' FDC chips! To install the doubler, you had to remove the SD FDC chip from the EI, plug it into the empty socket on the doubler PCB, then plug the doubler into the vacated FDC socket in the EI. Logic on the doubler board would select the correct FDC chip.
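A small C illustration of the pin squeeze (bit assignments here are hypothetical, purely to show the arithmetic): two pins can encode four Data Address Mark values, but once one of them is repurposed as density select, only the values with that bit clear can still be written - and TRS-DOS happened to rely on one of the others.

    #include <stdio.h>

    /* Illustration only: a 2-bit Data Address Mark gives four possible values.
       Repurposing one of the two selection pins as "density select" means
       only values with that bit clear remain writable. Bit assignments are
       made up for the example.                                               */

    #define DAM_PIN_A 0x1u
    #define DAM_PIN_B 0x2u   /* reassigned as density select on the DD chip */

    static int dam_writable_on_dd_fdc(unsigned dam)   /* dam in 0..3 */
    {
        return (dam & DAM_PIN_B) == 0;
    }

    int main(void)
    {
        for (unsigned dam = 0; dam < 4; dam++)
            printf("DAM %u: %s\n", dam,
                   dam_writable_on_dd_fdc(dam) ? "still writable"
                                               : "no longer reachable");
        return 0;
    }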



* Back in the early days of 3D Graphics cards, when they were called 3D Accelerators and even 3Dfx hadn't found their stride, there was the S3 Virge. The card had good 2D performance, but such a weak 3D chip that at least one reviewer called it, with good reason, the world's first 3D Decelerator. That epithet is pretty much ExactlyWhatItSaysOnTheTin, as 3D games performed worse on [=PCs=] with an S3 Virge installed than they did in software mode, i.e. with no 3D acceleration at all.
* The "Home Hub" series of routers provided by UK telecom giant BT are fairly capable devices for the most part, especially considering that they usually come free to new customers. Unfortunately, they suffer from a serious flaw in that they expect to be able to use Wi-Fi channels 1, 5, or 11, which are naturally very crowded considering the ubiquity of home Wi-Fi, and BT's routers in particular. And when that happens, the routers will endlessly rescan in an effort to get better speeds, knocking out your internet connection for 10-30 seconds every 20 minutes or so. Sure, you can manually force the router into using another, uncongested channel... except that it'll keep rescanning based on how congested channels 1, 5, and 11 are, even if there are ''no devices whatsoever'' on the channel that you set manually. Even BT's own advice is to use ethernet (and a powerline adapter if needed) for anything that you actually need a rock-solid connection on.

to:

* Back in the early days of 3D graphics cards, when they were called 3D Accelerators and even companies like 3Dfx hadn't found their stride, there was the S3 Virge. The card had good 2D performance, but such a weak 3D chip that at least one reviewer called it, with good reason, the world's first 3D Decelerator. That epithet is pretty much ExactlyWhatItSaysOnTheTin, as 3D games performed worse on [=PCs=] with an S3 Virge installed than they did in software mode, i.e. with no 3D acceleration at all.
* The "Home Hub" series of routers provided by UK telecom giant BT are fairly capable devices for the most part, especially considering that they usually come free to new customers. Unfortunately, they suffer from a serious flaw in that they expect to be able to use Wi-Fi channels 1, 5, or 11, which are naturally very crowded given the ubiquity of home Wi-Fi in general and BT's routers in particular. And when those channels get congested, the routers will endlessly rescan in an effort to get better speeds, knocking out your internet connection for 10-30 seconds every 20 minutes or so. Sure, you can manually force the router onto another, uncongested channel... except that it'll keep rescanning based on how congested channels 1, 5, and 11 are, even if there are ''no devices whatsoever'' on the channel that you set manually. Even BT's own advice is to use ethernet (and a powerline adapter if needed) for anything that you actually need a rock-solid connection on.



* Sony's [=HiFD=] "floptical" drive system. The Zip Drive and the LS-120 Superdrive had already attempted to displace the aging [=1.44MB=] floppy, but many predicted that the [=HiFD=] would be the real deal. At least until it turned out that Sony had utterly screwed up the [=HiFD=]'s write head design, which caused performance degradation, hard crashes, data corruption, and all sorts of other nasty problems. They took the drive off the market, then bought it back a year later... in a new 200MB version that was totally incompatible with disks used by the original 150MB version (and 720KB floppies as well), since the original [=HiFD=] design was so badly messed up that they couldn't maintain compatibility and make the succeeding version actually work. Sony has made a lot of weird, proprietary formats that have failed to take off for whatever reason, but the [=HiFD=] has to go down as the worst of the lot.

to:

* Sony's [=HiFD=] "floptical" drive system. The Zip Drive and the LS-120 Superdrive had already attempted to displace the aging [=1.44MB=] floppy, but many predicted that the [=HiFD=] would be the real deal. At least until it turned out that Sony had utterly screwed up the [=HiFD=]'s write head design, which caused performance degradation, hard crashes, data corruption, and all sorts of other nasty problems. They took the drive off the market, then brought it back a year later... in a new 200MB version that was totally incompatible with disks used by the original 150MB version (and 720KB floppies as well), since the original [=HiFD=] design was so badly messed up that they couldn't maintain compatibility and make the succeeding version actually work. Sony has made a lot of weird, proprietary formats that have failed to take off for whatever reason, but the [=HiFD=] has to go down as the worst of the lot.



** There is anecdotal evidence that IBM was engaging in deception, knowingly selling faulty products, and then spewing out rhetoric about the industry-standard failure rates of hard drives. This denial strategy started a chain reaction that led to a demise in customer confidence. Class-action lawsuits helped convince IBM to sell their hard drive division to Hitachi in 2002[[note]][[http://www.pcworld.co.nz/article/157888/25_worst_tech_products_all_time/ (See "Top 25 Worst Tech Products of All Time" for this and more.)]][[/note]], and possibly to exit the consumer market alltogether in 2006 by selling their consumer division to Lenovo.

to:

** There is anecdotal evidence that IBM was engaging in deception, knowingly selling faulty products and then spewing out rhetoric about the industry-standard failure rates of hard drives. This denial strategy started a chain reaction that led to a collapse in customer confidence. Class-action lawsuits helped convince IBM to sell their hard drive division to Hitachi in 2002, and possibly to exit the consumer market altogether in 2006 by selling their consumer division to Lenovo.



** One more idiot move - Iomega decided that "if you can't beat them, join them" in the early 2000s and released the Zip CD line of CD burners. However, due to bad luck, they unknowingly sourced batches of bad drives from Philips. This resulted in more coasters than you could've gotten over several years' worth of AOL free trial discs in your mailbox - apparently their QC department weren't doing their jobs. Another scathing lawsuit and product replacement program later, they quickly flipped distributors to Plextor. Except that by then, CD technology was improving and newer [=CD-RWs=] could store 700MB (and even later, 800MB) of data. On those Plextor drives, the extra 50MB-150MB is walled off and you can still only write 650MB of data using their drives when you could use that sweet extra space (150MB was a lot even in 2007) on other drives. This eventually caused them to be bought out by EMC Corp. Which later was restructured into a JV with conglomerate Lenovo.

to:

** One more idiot move - Iomega decided that "if you can't beat them, join them" in the early 2000s and released the Zip CD line of CD burners. However, due to bad luck, they unknowingly sourced batches of bad drives from Philips. This resulted in more coasters than you could've gotten from several years' worth of AOL free trial discs in your mailbox - apparently their QC department weren't doing their jobs. Another scathing lawsuit and product replacement program later, they quickly switched suppliers to Plextor. Except that by then, CD technology was improving and newer [=CD-RWs=] could store 700MB (and later even 800MB) of data. On those Plextor drives, the extra 50MB-150MB is walled off, so you could still only write 650MB of data using their drives when you could have used that sweet extra space (150MB was a lot even in 2007) on other drives. This eventually caused them to be bought out by EMC Corp., which was later restructured into a joint venture with conglomerate Lenovo.



* The 3M Superdisk and its proprietary 120MB "floptical" media were intended as direct competition to the Iomega Zip, but in order to penetrate a market that Iomega owned pretty tightly, the Superdisk needed a special feature to push it ahead. That feature was the possibility to write up to 32 megabytes on a bog-standard 1.44MB floppy, using lasers for alignment of the heads. Back then 32MB was significant storage, and people really liked the idea of recycling existing floppy stock - of which everybody had large amounts - into high-capacity media. The feature might just have given the Superdisk the edge it needed; unfortunately what wasn't immediately clear, nor explicitly stated, was that the drive was only able to do random writes on its specific 120MB disks. It could indeed write 32MB on floppies, but only if you rewrote '''all''' the data every time a change, no matter how small, was made - basically like a CD-RW disk with no packet-writing system. This took a relatively long time, and transformed the feature into an annoying gimmick. Disappointment ensued, and the format didn't even dent Iomega's empire before disappearing.

to:

* The 3M Superdisk and its proprietary 120MB "floptical" media were intended as direct competition to the Iomega Zip, but in order to penetrate a market that Iomega owned pretty tightly, the Superdisk needed a special feature to push it ahead. That feature was the possibility to write up to 32 megabytes on a bog-standard 1.44MB floppy, using lasers for alignment of the heads. Back then 32MB was significant storage, and people really liked the idea of recycling existing floppy stock - of which everybody had large amounts - into high-capacity media. The feature might just have given the Superdisk the edge it needed; unfortunately what wasn't immediately clear, nor explicitly stated, was that the drive was only able to do random writes on its specific 120MB disks. It could indeed write 32MB on floppies, but only if you rewrote '''all''' the data every time a change, no matter how small, was made - basically like a CD-RW disk with no packet-writing system. This took a relatively long time, and transformed the feature into an annoying gimmick. Disappointment ensued, and the format didn't even dent Iomega's empire before disappearing.



* After solid-state drives started taking over from mechanical hard drives as the storage device of choice for high-end users, it quickly became obvious that the transfer speeds would soon be bottlenecked by the speed of the Serial ATA standard, and that PCI Express was the obvious solution. Using it in the form of full-sized cards wasn't exactly optimal, though, and the smaller M.2 form factor is thermally limited and can be fiddly to install cards in. The chipset industry's answer was SATA Express, a clunky solution which required manufacturers to synchronise data transfers over two lanes of PCI Express and two SATA ports, standards with ''completely'' different ways of working. Just to make it even worse, the cable was an ugly mess consisting of four separate wires (two SATA, one PCI-E, and a SATA power connector that hung off the end of the cable). The end result was one of the most resounding failures of an industry standard in computing history, as a grand total of ''zero'' storage products made use of it (albeit a couple of manufacturers jury-rigged it into a way of connecting front-panel [=USB3.1=] ports), with SSD manufacturers instead flocking to the SFF-8639 (later renamed U.2) connector, essentially just four PCI-E lanes crammed into a simple cable.
* To call the Kingston [=HyperX=] Fury RGB SSD a perfect example of form over function would be a lie by omission, as with this SSD form ''actively affects'' function. Kingston thought it would be a good idea to cram ''75 [=LEDs=]'' into a 2.5-inch enclosure without insulating the storage aspect or, apparently, adequately testing the thermals, and the result is catastrophic. The heat from the [=LEDs=] - potentially ''over 70 degrees celsius'' - causes extreme thermal throttling that, as shown in [[https://www.youtube.com/watch?v=vnST5rA64Oc this video]], causes performance issues that can prevent programs from starting and even ''cause the computer to hang on boot''; the uploader also speculated that it could corrupt data. The thermal throttling can get so bad that a gigantic fan is needed to cool the drive enough to be able to turn the LED array off in software, at which point you might as well buy a normal SSD and leave the gimmicky RGB lighting separate from anything where performance is important. And before you ask "Well, why can't I just unplug the [=LEDs=]?", that just causes the thermonuclear reaction happening in your primary storage device to default to red and removes any control you have over it, because it is powered through the ''drive's'' power connector. [[ItWontTurnOff Assuming it isn't powered by]] [[VideoGame/{{Doom3}} demons]].

to:

* After solid-state drives started taking over from mechanical hard drives as the storage device of choice for high-end users, it quickly became obvious that transfer speeds would soon be bottlenecked by the Serial ATA standard, and that PCI Express was the obvious solution. Using it in the form of full-sized cards wasn't exactly optimal, though, and the smaller M.2 form factor is thermally limited and can be fiddly to install cards in. The chipset industry's answer was SATA Express, a clunky solution which required manufacturers to synchronise data transfers over two lanes of PCI Express and two SATA ports, standards with ''completely'' different ways of working. Just to make it even worse, the cable was an ugly mess consisting of four separate wires (two SATA, one PCI-E, and a SATA power connector that hung off the end of the cable). The end result was one of the most resounding failures of an industry standard in computing history, as a grand total of ''zero'' storage products made use of it (albeit a couple of manufacturers jury-rigged it into a way of connecting front-panel [=USB3.1=] ports), with SSD manufacturers instead flocking to the SFF-8639 (later renamed U.2) connector, which was just four PCI-E lanes crammed into a simple cable.
* To call the Kingston [=HyperX=] Fury RGB SSD a perfect example of form over function would be a lie by omission, as with this SSD, form ''actively harms'' function. Kingston thought it would be a good idea to cram ''75 [=LEDs=]'' into a 2.5-inch enclosure without thermally isolating the storage components or, apparently, adequately testing the thermals, and the result is catastrophic. The heat from the [=LEDs=] - potentially ''over 70 degrees Celsius'' - causes extreme thermal throttling that, as shown in [[https://www.youtube.com/watch?v=vnST5rA64Oc this video]], causes performance issues that can prevent programs from starting and even ''cause the computer to hang on boot''; the uploader also speculated that it could corrupt data. The thermal throttling can get so bad that a gigantic fan is needed to cool the drive enough to be able to turn the LED array off in software, at which point you might as well buy a normal SSD and leave the gimmicky RGB lighting separate from anything where performance matters. And before you ask "Well, why can't I just unplug the [=LEDs=]?", that just causes the thermonuclear reaction happening in your primary storage device to default to red and removes any control you have over it, because it is powered through the ''drive's'' power connector. [[ItWontTurnOff Assuming it isn't powered by]] [[VideoGame/{{Doom3}} demons]].



*** The main hardware seemed to be designed with little foresight or consideration to developers. The "Tom" GPU could do texture-mapping, but it couldn't do it well (even the official documentation admits that texture-mapping slows the system to a crawl) thanks to VRAM access not being fast enough. This is the reason behind the flat, untextured look of many [=3D=] Jaguar games and meant that the system could not stand up graphically to even the [[UsefulNotes/ThreeDOInteractiveMultiplayer 3DO]] which came out around the same time, let alone the UsefulNotes/Nintendo64, Creator/{{Sony}} UsefulNotes/PlayStation, or even the UsefulNotes/SegaSaturn, basically [[TechnologyMarchesOn dooming the system to obsolescence right out of the gate.]] Yes, texture-mapping in video games was a fairly new thing in 1993 with the release of games like ''VideoGame/RidgeRacer'', but the 3DO came out a month prior to the Jaguar with full texture-mapping capabilities, so one would think someone at Atari would catch on and realize that texture-mapped graphics were the future.

to:

*** The main hardware seemed to be designed with little foresight or consideration for developers. The "Tom" GPU could do texture-mapping, but it couldn't do it well (even the official documentation admits that texture-mapping slows the system to a crawl) thanks to VRAM access not being fast enough. This is the reason behind the flat, untextured look of many [=3D=] Jaguar games, and it meant that the system could not stand up graphically even to the [[UsefulNotes/ThreeDOInteractiveMultiplayer 3DO]] which came out around the same time, let alone the UsefulNotes/Nintendo64, Creator/{{Sony}} UsefulNotes/PlayStation, or even the UsefulNotes/SegaSaturn, basically [[TechnologyMarchesOn dooming the system to obsolescence right out of the gate.]] Yes, texture-mapping in video games was a fairly new thing in 1993 with the release of games like ''VideoGame/RidgeRacer'', but the 3DO came out a month prior to the Jaguar with full texture-mapping capabilities, so one would think someone at Atari would catch on and realize that texture-mapped graphics were the future.



*** The controller only included three main action buttons, a configuration which was already causing issues for the UsefulNotes/SegaGenesis at the time. In a baffling move, the controller also featured a numeric keypad, something that Atari had last done on the 5200. On that occasion the keypad was pretty superfluous and generally ignored by developers, but it was only taking up what would probably have been unused space on the controller, so it didn't do any harm by being there. The Jaguar's keypad, on the other hand, was far bigger, turning the controller into an ungodly monstrosity that has often been ranked as the absolute worst videogame controller of all-time.[[note]]Its main competition coming ironically from the 5200 controller, though that one's more often given a pass on the grounds that it would have been decent if Atari hadn't cut so many corners, and that by 1993 the company definitely should have known better.[[/note]] Atari later saw sense and produced a revised controller that added in three more command buttons and shoulder buttons, but for compatibility reasons they couldn't ditch the keypad - in fact, the five new buttons were just remaps of five of the keypad buttons - meaning that the newer version was similarly uncomfortable. Note that the Jaguar's controller was in fact designed originally for the Atari Panther, their unreleased 32-bit console that was scheduled to come out in 1991 before it became obvious that the Genesis' 3-button configuration wasn't very future-proof. They evidently figured that the keypad gave them more than enough buttons and didn't bother creating a new controller for the Jaguar, a decision that would prove costly.

to:

*** The controller only included three main action buttons, a configuration which was already causing issues for the UsefulNotes/SegaGenesis at the time. In a baffling move, the controller also featured a numeric keypad, something that Atari had last done on the 5200. On that occasion the keypad was pretty superfluous and generally ignored by developers, but it was only taking up what would probably have been unused space on the controller, so it didn't do any harm by being there. The Jaguar's keypad, on the other hand, was far bigger, turning the controller into an ungodly monstrosity that has often been ranked as the absolute worst videogame controller of all time.[[note]]Its main competition coming, ironically, from the 5200 controller, though that one's more often given a pass on the grounds that it would have been decent if Atari hadn't cut so many corners, and that by 1993 the company definitely should have known better.[[/note]] Atari later saw sense and produced a revised controller that added three more command buttons and shoulder buttons, but for compatibility reasons they couldn't ditch the keypad - in fact, the five new buttons were just remaps of five of the keypad buttons - meaning that the newer version was similarly uncomfortable. The Jaguar's controller was in fact originally designed for the Atari Panther, their unreleased 32-bit console that was scheduled to come out in 1991, before it became obvious that the Genesis' 3-button configuration wasn't very future-proof. They evidently figured that the keypad gave them more than enough buttons and didn't bother creating a new controller for the Jaguar, a decision that would prove costly.



*** Microsoft recommends to not have the original UsefulNotes/XboxOne model in any position other than horizontal because the optical drive isn't designed for any orientation other than that. Note that ''every'' 360 model was rated to work in vertical orientation, even with the aforementioned scratching problem, and Microsoft quickly restored support for vertical orientation with the updated Xbox One S model.
** Most of the 360's problems stem from the inexplicable decision to use a full-sized desktop DVD drive, which even in the larger original consoles took almost a quarter of their internal volume. Early models also had four rather large chips on the motherboard, due to the 90 nm manufacturing process, which also made them run quite hot (especially the [[UsefulNotes/GraphicsProcessingUnit GPU]]-[[UsefulNotes/VideoRAM VRAM]] combo that doubled as a northbridge). But the relative positions of the GPU and the drive (and the latter's bulk) meant that there simply wasn't any room to put any practical heatsink. Microsoft tried to address this problem in two separate motherboard redesigns, the first of which finally added at least ''some'' heatsink, but it was only a third, when the chipset was shrunk to just two components, which allowed designers to completely reshuffle the board and even add a little fan atop the new, large heatsink, which finally did away with the problem somewhat. However, even the Slim version still uses that hugeass desktop DVD drive, which still has ''no'' support for the disk, perpetuating the scratching problem.

to:

*** Microsoft recommends against having the original UsefulNotes/XboxOne model in any position other than horizontal, because the optical drive isn't designed for any other orientation. ''Every'' 360 model, by contrast, was rated to work in vertical orientation, even with the aforementioned scratching problem, and Microsoft quickly restored support for vertical orientation with the updated Xbox One S model.
** Most of the 360's problems stem from the inexplicable decision to use a full-sized desktop DVD drive, which even in the larger original consoles took up almost a quarter of their internal volume. Early models also had four rather large chips on the motherboard, due to the 90 nm manufacturing process, which also made them run quite hot (especially the [[UsefulNotes/GraphicsProcessingUnit GPU]]-[[UsefulNotes/VideoRAM VRAM]] combo that doubled as a northbridge). But the relative positions of the GPU and the drive (and the latter's bulk) meant that there simply wasn't any room to put any practical heatsink. Microsoft tried to address this problem in two separate motherboard redesigns, the first of which finally added at least ''some'' heatsink, but it was only the third, when the chipset was shrunk to just two components, that allowed designers to completely reshuffle the board and even add a little fan atop the new, large heatsink, which finally did away with the problem somewhat. However, even the Slim version still uses that hugeass desktop DVD drive, which still has ''no'' support for the disc, perpetuating the scratching problem.



** Early in the life of the 360, many gamers used the console's optional VGA cable to play their games with HD graphics, as true [=HDTVs=] tended to be rare and expensive back then. PC monitors at the time usually had a 4:3 aspect ratio, which most game engines were smart enough to handle by simply sticking black bars at the top and bottom of the screen, with a few even rendering natively at the right resolution. However, some engines (including the one used for ''VideoGame/NeedForSpeed Most Wanted'' and ''Carbon'') instead rendered the game in 480p - likely the only 4:3 resolution they supported - and upscaled the output. Needless to say, playing a 480p game stretched to a higher resolution [[note]](1280×1024 being the most common resolution for mid-range monitors at the time - which you may notice is actually a 5:4 aspect ratio instead of 4:3, meaning that you had pixel distortion to deal with on top of other things)[[/note]] looked awful, and arguably even worse than just playing it on an SDTV.

to:

** Early in the life of the 360, many gamers used the console's optional VGA cable to play their games with HD graphics, as true [=HDTVs=] tended to be rare and expensive back then. PC monitors at the time usually had a 4:3 aspect ratio, which most game engines were smart enough to handle by simply sticking black bars at the top and bottom of the screen, with a few even rendering natively at the right resolution. However, some engines (including the one used for ''VideoGame/NeedForSpeed Most Wanted'' and ''Carbon'') instead rendered the game in 480p - likely the only 4:3 resolution they supported - and upscaled the output. Needless to say, playing a 480p game stretched to a higher resolution [[note]](1280×1024 being the most common resolution for mid-range monitors at the time - which you may notice is actually a 5:4 aspect ratio instead of 4:3, meaning that you had pixel distortion to deal with on top of other things)[[/note]] looked awful, and arguably even worse than just playing it on an SDTV.
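The distortion mentioned in that note is easy to quantify: 1280×1024 is a 1.25 (5:4) aspect ratio while the game expects 1.333 (4:3), so everything on screen ends up squashed or stretched by roughly 6-7%. A trivial C check of the arithmetic:

    #include <stdio.h>

    /* 4:3 content displayed full-screen on a 5:4 (1280x1024) panel is
       distorted by the ratio of the two aspect ratios -- roughly 6.7%.       */
    int main(void)
    {
        const double game_ar  = 4.0 / 3.0;        /* ~1.333 */
        const double panel_ar = 1280.0 / 1024.0;  /* 1.25   */
        printf("distortion: %.1f%%\n", (game_ar / panel_ar - 1.0) * 100.0);
        return 0;
    }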



* The UsefulNotes/{{Wii}} literally has no crash handler. So if you manage to crash your system, you open it up to Arbitrary Code Execution, and a whole load of security vulnerabilities await you. Do you have an SD card inserted? Well, crash any game that reads and writes to it and even ''more'' vulnerabilities open up. They'll tell you that they fixed these vulnerabilities though system updates, but in reality, they never did. In fact, the only thing these updates did on that matter was simply remove anything that was installed ''with'' these vulnerabilities - nothing's stopping you from using these vulnerabilities again to ''re''-install them. [[GoodBadBugs Of course, all of this is a good thing if you like modding your console.]]
** While the UsefulNotes/WiiU ultimately proved a failure for several reasons, poor component choices helped contribute to its near-total lack of third-party support. It'd only be a slight exaggeration to say that the system's CPU was essentially just three heavily overclocked Wii [=CPUs=] -- the Wii's own CPU in turn being just a higher-clocked version of the CPU that had been used a decade earlier in the UsefulNotes/NintendoGameCube -- slapped together on the same die, with performance that was abysmally poor by 2012 standards. Its GPU, while not ''as'' slow, wasn't all that much faster than those of the [=PS3=] and Xbox 360,[[note]]in fact, on paper the Wii U's GPU was actually ''slower'' than the [=GPUs=] of either of those consoles, but due to advances in technology was about twice as fast in practice... unless you offloaded code to the GPU to offset the lack of CPU grunt, which would bring it back down to, if not below, [=PS3/360=] levels[[/note]] and used a shader model in-between those of the older consoles and their successors, meaning that ported [=PS3=]/360 games didn't take advantage of the newer hardware, while games designed for the [=PS4=] and Xbox One wouldn't even work to begin with due to the lack of necessary feature support. While Nintendo likely stuck with the [=PowerPC=] architecture for UsefulNotes/BackwardsCompatibility reasons, the system would likely have fared much better if Nintendo had just grabbed an off-the-shelf AMD laptop APU - which had enough power even in 2012 to brute-force emulate the Wii, eliminating the main reason to keep with the [=PowerPC=] line - stuffed it into a Wii case and called it a day. Fortunately, Nintendo seems to have learned from this, basing the UsefulNotes/NintendoSwitch on an existing [=nVidia=] mobile chip which thus far has proven surprisingly capable of punching above its weight.

to:

* The UsefulNotes/{{Wii}} literally has no crash handler. So if you manage to crash your system, you open it up to Arbitrary Code Execution, and a whole load of security vulnerabilities await you. Do you have an SD card inserted? Well, crash any game that reads and writes to it and even ''more'' vulnerabilities open up. They'll tell you that they fixed these vulnerabilities through system updates, but in reality, they never did. In fact, the only thing these updates did in that regard was simply remove anything that was installed ''with'' these vulnerabilities - nothing's stopping you from using these vulnerabilities again to ''re''-install them. [[GoodBadBugs All of this is a good thing if you like modding your console.]]
** While the UsefulNotes/WiiU ultimately proved a failure for several reasons, poor component choices helped contribute to its near-total lack of third-party support. It'd only be a slight exaggeration to say that the system's CPU was essentially just three heavily overclocked Wii [=CPUs=] -- the Wii's own CPU in turn being just a higher-clocked version of the CPU that had been used a decade earlier in the UsefulNotes/NintendoGameCube -- slapped together on the same die, with performance that was abysmally poor by 2012 standards. Its GPU, while not ''as'' slow, wasn't all that much faster than those of the [=PS3=] and Xbox 360,[[note]]in fact, on paper the Wii U's GPU was actually ''slower'' than the [=GPUs=] of either of those consoles, but due to advances in technology was about twice as fast in practice... unless you offloaded code to the GPU to offset the lack of CPU grunt, which would bring it back down to, if not below, [=PS3/360=] levels[[/note]] and used a shader model in-between those of the older consoles and their successors, meaning that ported [=PS3=]/360 games didn't take advantage of the newer hardware, while games designed for the [=PS4=] and Xbox One wouldn't even work to begin with due to the lack of necessary feature support. While Nintendo likely stuck with the [=PowerPC=] architecture for UsefulNotes/BackwardsCompatibility reasons, the system would likely have fared much better if Nintendo had just grabbed an off-the-shelf AMD laptop APU - which had enough power even in 2012 to brute-force emulate the Wii, eliminating the main reason to keep with the [=PowerPC=] line - stuffed it into a Wii case and called it a day. Fortunately, Nintendo seems to have learned from this, basing the UsefulNotes/NintendoSwitch on an existing [=nVidia=] mobile chip which thus far has proven surprisingly capable of punching above its weight.



** The UsefulNotes/SegaSaturn is, despite its admitted strong points on the player end, seen as one of the worst major consoles internally. It was originally intended to be the best 2D gaming system out there (which it was), so its design was basically just a 32X with higher clockspeeds, more memory, and CD storage. However, partway through development Sega learned of Sony's and Nintendo's upcoming systems (the [=PlayStation=] and [=Nintendo 64=] respectively) which were both designed with 3D games in mind, and realized the market - especially in their North American stronghold - was about to shift under their feet; they wouldn't have a prayer of competing. So, in an effort to bring more power to the console, Sega added an extra CPU and GPU to the system, which sounds great at first... until you consider that there were also ''six other processors'' that couldn't interface with each other very well. This also made the motherboard prohibitively complex, making the Saturn the most expensive console of its time. And lastly, much like the infamous Nvidia [=NV1=] which has its own example on this very page, the GPU worked on four-sided basic primitives while the industry standard was three sides, a significant hurdle for multiplatform games as those developed with triangular primitives would require extensive porting work to adapt them to quads. All this piled-on complication made development on the Saturn a nightmare. Ironically, consoles with multiple CPU cores would become commonplace two generations later with the Xbox 360 and [=PlayStation=] 3; like many of Sega's other products of that era, they had attempted to push new features before game developers were really ready to make use of them.

to:

** The UsefulNotes/SegaSaturn is, despite its admitted strong points on the player end, seen as one of the worst major consoles internally. It was originally intended to be the best 2D gaming system out there (which it was), so its design was directly based on the 32X with higher clockspeeds, more memory, and CD storage. However, partway through development Sega learned of Sony's and Nintendo's upcoming systems (the [=PlayStation=] and [=Nintendo 64=] respectively) which were both designed with 3D games in mind, and realized the market - especially in their North American stronghold - was about to shift under their feet; they wouldn't have a prayer of competing. So, in an effort to bring more power to the console, Sega added an extra CPU and GPU to the system, which sounds great at first... until you consider that there were also ''six other processors'' that couldn't interface with each other very well. This also made the motherboard prohibitively complex, making the Saturn the most expensive console of its time. And lastly, much like the infamous Nvidia [=NV1=] which has its own example on this very page, the GPU worked on four-sided basic primitives while the industry standard was three sides, a significant hurdle for multiplatform games as those developed with triangular primitives would require extensive porting work to adapt them to quads. All this piled-on complication made development on the Saturn a nightmare. Ironically, consoles with multiple CPU cores would become commonplace two generations later with the Xbox 360 and [=PlayStation=] 3; like many of Sega's other products of that era, they had attempted to push new features before game developers were really ready to make use of them.



*** Ironically, one area where the Dreamcast is much worse than the Saturn is security. The Saturn had a robust security system similar to that of the [[UsefulNotes/PlayStation Sony PlayStation]] that took decades to defeat,[[note]]The simplified explanation is that a "wobble groove" was read off the disc, and if it didn't see it, it didn't boot[[/note]] so when Sega was designing the Dreamcast's copy protection mechanism, they took what they learned from the Saturn and threw it in the garbage in favor of using a proprietary GD-ROM format (this same format was also being used on their arcade hardware at the time) to boot games from as the console's sole security. On paper, this seemed like a good idea, but there was one gaping hole in the Dreamcast's security system: the system could accept ''another'' of Sega's proprietary formats called MIL-CD, basically like an Enhanced CD but with Dreamcast-specific features. The format was a major flop with no [=MIL-CDs=] ever releasing outside of Japan, but pirates quickly figured out that MIL-CD had no hardware-level copy protection. Basically, a MIL-CD's boot data was scrambled from the factory, and the Dreamcast contained an "unscrambler" that would descramble the boot data into something readable and boot into the game. A Dreamcast SDK was all pirates needed to defeat this, and running pirated games on the Dreamcast was as easy as burning a cracked ISO onto a CD-R and putting it in the machine. Sega removed MIL-CD support on a late revision of the Dreamcast to combat this, but it was too late, and the Dreamcast would become the most pirated disc-based home console of all time.
*** On a lesser note, the Dreamcast could only read 100KB of save data from a single VMU at a time, with no room for expansion. Compared to the UsefulNotes/PlayStation2's gargantuan 8MB memory card, this was absolutely ''tiny''. They attempted to release a [=4X=] Memory Card, but it had to work around the Dreamcast's design flaw of only reading 100KB from a card at a time by separating the 400KB of space into four "pages". The major downside is that games couldn't be stored over multiple pages as it was essentially four memory cards in one, and some games wouldn't detect it or would outright ''[[GameBreakingBug crash]]'' when trying to read from it. You also couldn't copy game saves between pages without a second memory card, and the 4X Memory Card didn't support any VMU-specific features, as it lacked a screen and face buttons.

to:

*** Ironically, one area where the Dreamcast is much worse than the Saturn is security. The Saturn had a robust security system similar to that of the [[UsefulNotes/PlayStation Sony PlayStation]] that took decades to defeat,[[note]]The simplified explanation is that a "wobble groove" was read off the disc, and if it didn't see it, it didn't boot[[/note]] so when Sega was designing the Dreamcast's copy protection mechanism, they took what they learned from the Saturn and threw it in the garbage in favor of using a proprietary GD-ROM format (this same format was also being used on their arcade hardware at the time) to boot games from as the console's sole security. On paper, this seemed like a good idea, but there was one gaping hole in the Dreamcast's security system: the system could accept ''another'' of Sega's proprietary formats called MIL-CD, which was like an Enhanced CD but with Dreamcast-specific features. The format was a major flop with no [=MIL-CDs=] ever releasing outside of Japan, but pirates quickly figured out that MIL-CD had no hardware-level copy protection. A MIL-CD's boot data was scrambled from the factory, and the Dreamcast contained an "unscrambler" that would descramble the boot data into something readable and boot into the game. A Dreamcast SDK was all pirates needed to defeat this, and running pirated games on the Dreamcast was as easy as burning a cracked ISO onto a CD-R and putting it in the machine. Sega removed MIL-CD support on a late revision of the Dreamcast to combat this, but it was too late, and the Dreamcast would become the most pirated disc-based home console of all time.
*** On a lesser note, the Dreamcast could only read 100KB of save data from a single VMU at a time, with no room for expansion. Compared to the UsefulNotes/PlayStation2's gargantuan 8MB memory card, this was absolutely ''tiny''. They attempted to release a [=4X=] Memory Card, but it had to work around the Dreamcast's design flaw of only reading 100KB from a card at a time by separating the 400KB of space into four "pages". The major downside is that games couldn't be stored over multiple pages as it was essentially four memory cards in one, and some games wouldn't detect it or would outright ''[[GameBreakingBug crash]]'' when trying to read from it. You also couldn't copy game saves between pages without a second memory card, and the 4X Memory Card didn't support any VMU-specific features, as it lacked a screen and face buttons.



** The original model of the PSP had buttons too close to the screen, so the Einsteins at Sony moved over the switch for the Square button without moving the location of the button itself. Thus every PSP had an unresponsive Square button that would also often stick. Note that the Square button is the second-most important face button on the controller, right behind X; in other words, it's used constantly during the action in most games. Sony president Ken Kutaragi confirmed that ''this was intentional'', conflating this basic technical flaw with the concept of artistic expression. This is a real quote sourced by dozens of trusted publications. The man ''actually went there''.

to:

** The original model of the PSP had buttons too close to the screen, so the Einsteins at Sony moved over the switch for the Square button without moving the location of the button itself. Thus every PSP had an unresponsive Square button that would also often stick. The Square button is the second-most important face button on the controller, right behind X; in other words, it's used constantly during the action in most games. Sony president Ken Kutaragi confirmed that ''this was intentional'', conflating this basic technical flaw with the concept of artistic expression. This is a real quote sourced by dozens of trusted publications. The man ''actually went there''.



** The usage of an optical disc format on the PSP can qualify. On paper, it made perfect sense to choose optical discs over cartridges because of the former's storage size advantages and relatively low manufacturing expense, not to mention that it enabled the release of high-quality video on the PSP. After all, utilizing optical discs instead of cartridges had propelled the original UsefulNotes/PlayStation to much greater success than [[UsefulNotes/Nintendo64 its competition from Nintendo]], so why wouldn't the same thing happen again in the handheld market? However, in practice, optical discs quickly proved to be a poor fit for a handheld system: Sony's Universal Media Disc (UMD) was fragile, as there have been many cases of the outer plastic shell which protected the actual data disc cracking and breaking, rendering the disc useless until the user buys an aftermarket replacement shell. In addition, it wasn't uncommon for the PSP's UMD drive to fail due to wear of the moving parts. UMD also failed to catch on as a video format due to its proprietary technology making the discs more expensive to produce than other optical disc formats, which meant they were priced higher than [=DVDs=]. There were also no devices besides the PSP that could play UMD movies, meaning that you were stuck watching them on the PSP's small screen, and couldn't swap them with your friends unless they too had a PSP. This drove away consumers, who would rather purchase a portable DVD player and have access to a cheaper media library, while the more tech-savvy could rip their [=DVDs=] and put them on Memory Sticks to watch on the PSP without a separate disc. In addition, [[LoadsAndLoadsOfLoading UMD load times were]] ''[[LoadsAndLoadsOfLoading long]]'' compared to those of a standard UsefulNotes/NintendoDS cartridge, which Sony themselves tried to fix by doubling the memory of the PSP in later models to use as a UMD cache. By 2009, Sony was trying to phase out the UMD with the PSP Go, which did not have a UMD drive and relied on digital downloads from the [=PlayStation=] Store, but it was too late; most games were already released on UMD while very few were actually made available digitally, and so the Go blocked off a major portion of the PSP library, which led consumers to ignore it. Sony's decision to use standard cartridges for the PSP's successor, the UsefulNotes/PlayStationVita, seemed like a tacit admission that using UMD on the PSP was a mistake.

to:

** The usage of an optical disc format on the PSP can qualify. On paper, it made perfect sense to choose optical discs over cartridges because of the former's storage size advantages and relatively low manufacturing expense, and it enabled the release of high-quality video on the PSP. After all, utilizing optical discs instead of cartridges had propelled the original UsefulNotes/PlayStation to much greater success than [[UsefulNotes/Nintendo64 its competition from Nintendo]], so there was no reason to assume the same thing wouldn't happen again in the handheld market. However, in practice, optical discs quickly proved to be a poor fit for a handheld system: Sony's Universal Media Disc (UMD) was fragile, as there have been many cases of the outer plastic shell which protected the actual data disc cracking and breaking, rendering the disc useless until the user buys an aftermarket replacement shell. In addition, it wasn't uncommon for the PSP's UMD drive to fail due to wear of the moving parts. UMD also failed to catch on as a video format due to its proprietary technology making the discs more expensive to produce than other optical disc formats, which meant they were priced higher than [=DVDs=]. There were also no devices besides the PSP that could play UMD movies, meaning that you were stuck watching them on the PSP's small screen, and couldn't swap them with your friends unless they too had a PSP. This drove away consumers, who would rather purchase a portable DVD player and have access to a cheaper media library, while the more tech-savvy could rip their [=DVDs=] and put them on Memory Sticks to watch on the PSP without a separate disc. In addition, [[LoadsAndLoadsOfLoading UMD load times were]] ''[[LoadsAndLoadsOfLoading long]]'' compared to those of a standard UsefulNotes/NintendoDS cartridge, which Sony themselves tried to fix by doubling the memory of the PSP in later models to use as a UMD cache. By 2009, Sony was trying to phase out the UMD with the PSP Go, which did not have a UMD drive and relied on digital downloads from the [=PlayStation=] Store, but it was too late; most games were already released on UMD while very few were actually made available digitally, and so the Go blocked off a major portion of the PSP library, which led consumers to ignore it. Sony's decision to use standard cartridges for the PSP's successor, the UsefulNotes/PlayStationVita, seemed like a tacit admission that using UMD on the PSP was a mistake.



** Reliability issues aside, the [=PS3=]'s actual hardware wasn't exactly a winner either, and depending on who you ask, the [=PS3=] was either a well-designed if misunderstood piece of tech or had the worst internal console architecture since the UsefulNotes/SegaSaturn. Ken Kutaragi envisioned the [=PlayStation=] 3 as "a supercomputer for the home", and as a result Sony implemented their Cell Broadband Engine processor, co-developed by IBM and Toshiba for supercomputer applications, into the console. While this in theory made the console much more powerful than the Xbox 360, in practice it made the system exponentially more difficult to program for, as the CPU was not designed with video games in mind. In layman's terms, it featured eight individually programmable "cores": one general-purpose, with the others being far more specialized and having only limited access to the rest of the system. Contrast that with, say, the Xbox 360's Xenon processor, which used a much more conventional architecture of three general-purpose cores and was much easier to program - and that was exactly the [=PS3=]'s downfall from a hardware standpoint. The Cell processor's unconventional architecture meant that it was notoriously difficult to write efficient code for (for comparison, a routine that takes only a few lines of code on a conventional processor could easily balloon into hundreds of lines of Cell code), and good luck rewriting code designed for conventional processors to run on the Cell, which explains why many multi-platform games ran better on the 360 than on the [=PS3=]: many developers weren't so keen on spending development resources rewriting an entire game to run properly on the [=PS3=], so what they would do instead was run the game on the general-purpose core and ignore the rest, effectively using only a fraction of the system's power. While developers would later put out some visually stunning games for the system, Sony saw the writing on the wall that the industry had moved towards favoring ease of porting across platforms over the raw power of highly bespoke but hard-to-work-with architectures, and abandoned exotic proprietary chipsets in favor of off-the-shelf, easier-to-program AMD processors from the [=PS4=] onward.
** A common criticism of the UsefulNotes/PlaystationVita is that managing your games and saves is a tremendous hassle: for some reason, deleting a Vita game will also delete its save files, meaning that if you want to make room for a new game you'll have to kiss your progress goodbye. This can be circumvented by transferring the files to a PC or uploading them to the cloud, but the latter requires a [=PlayStation=] Plus subscription to use. One wonders why they don't allow you to simply keep the save file like the [=PS1=] and PSP games do. This is made all the more annoying by the Vita's notoriously small and overpriced proprietary memory cards (itself possibly based on Sony's failed Memory Stick Micro M2 format, not to be confused with M.2 solid state drives, as they have very similar form factors, but of course the M2 is not compatible with the Vita), which means that if you buy a lot of games in digital format, you probably won't be able to hold your whole collection at the same time, even if you shell out big money for a [=32GB=] (the biggest widely-available format, about $60) or [=64GB=] ([[NoExportForYou must be imported from Japan]], can cost over $100, and is reported to sometimes suffer issues such as slow loading, game crashes, and data loss) card.
** And speaking of the Vita, the usage of proprietary memory cards can count as this: rumor has it that the Vita was designed with SD cards in mind, but [[ExecutiveMeddling greedy executives forced Sony engineers to use a proprietary memory card format]], crippling the system's multimedia capabilities when the biggest memory card you can buy for the system is only [=32GB=] ([=64GB=] in Japan) and a sizable [=MP3=] library can easily take up ''half'' of that. Worse, the Vita came with no memory card out of the box and had no flash memory, so any unsuspecting customer might be greeted with a useless slab of plastic until they shell out extra cash for a memory card. Sony's short-sighted greed over the memory cards is cited as one of the major contributing factors of the console's early demise. The PCH-2000 models (often nicknamed the PS Vita Slim) come with internal flash memory, but the damage was already done, not to mention that if you insert a memory card, you cannot use the flash memory.
*** Another negative side-effect of Sony using proprietary memory cards for the Vita is how user-unfriendly it is to store media on the console. Yes, the PSP used Sony's proprietary Memory Stick Duo, but at least it was widely adopted by Sony (and some third parties) outside the PSP and thus memory card readers for the Memory Stick Duo are readily available to this day, not to mention that with the PSP, you can plug it into a computer and simply drag and drop media files onto the system. The Vita doesn't do that: in what was possibly an effort to lock down the Vita in the wake of the rampant hacking and piracy on the PSP (which may have also influenced the decision to use proprietary memory cards), Sony made it so that the Vita needed to interface with special software installed on the user's PC, called the Content Manager Assistant, to transfer data. In addition, the user needed to select a directory for the Vita to copy from and select which files to copy ''from the Vita'', which is much less convenient than simply dragging and dropping files directly onto the system like you would with a smartphone or the PSP. Also, you need to be logged into the [=PlayStation=] Network to do any of this. Finally, this was all [[ShaggyDogStory rendered moot]] when hackers found an exploit that allowed Vita users to install unauthorized software that enabled the running of homebrew on the system. This was accomplished via, you guessed it, [[HoistByHisOwnPetard the Content Manager Assistant.]]
** The way the Vita handles connecting to cellular networks. If implemented correctly, the Vita could've been wildly innovative in this regard. However, that's not what Sony did. Sony's biggest mistake was opting to have the Vita connect to 3G at a time when 4G had already overtaken 3G in popularity, meaning customers likely could not use their existing data plans to connect their Vitas to the internet over cellular. But that wasn't all: 3G connectivity was exclusive to AT&T subscribers, meaning that even if you were still on 3G, if you were subscribed to a carrier ''other'' than AT&T, you still had to purchase an entirely separate data plan just to connect your Vita to the network, which was $30 monthly for a mid-tier data limit. Even then, 3G functionality was extremely limited, only allowing the user to view messages, browse the internet, download games under 20MB, and play asynchronous (i.e. turn-based) games online. It was such a hassle for such a restricted feature (not to mention that many smartphones allowed users to tether Wi-Fi devices to the cellular network, which, while not as convenient, essentially allowed users to use their Vita as if it was connected to standard Wi-Fi) that it simply wasn't worth the extra $50 for the 3G-enabled model. The implementation of 3G was so bad that Sony scrapped it altogether for the PS Vita Slim.

to:

** Reliability issues aside, the [=PS3=]'s actual hardware wasn't exactly a winner either, and depending on who you ask, the [=PS3=] was either a well-designed if misunderstood piece of tech or had the worst internal console architecture since the UsefulNotes/SegaSaturn. Ken Kutaragi envisioned the [=PlayStation=] 3 as "a supercomputer for the home", and as a result Sony implemented their Cell Broadband Engine processor, co-developed by IBM and Toshiba for supercomputer applications, into the console. While this in theory made the console much more powerful than the Xbox 360, in practice it made the system exponentially more difficult to program for, as the CPU was not designed with video games in mind. In layman's terms, it featured eight individually programmable "cores": one general-purpose, with the others being far more specialized and having only limited access to the rest of the system. Contrast that with, say, the Xbox 360's Xenon processor, which used a much more conventional architecture of three general-purpose cores and was much easier to program - and that was exactly the [=PS3=]'s downfall from a hardware standpoint. The Cell processor's unconventional architecture meant that it was notoriously difficult to write efficient code for (for comparison, a routine that takes only a few lines of code on a conventional processor could easily balloon into hundreds of lines of Cell code), and good luck rewriting code designed for conventional processors to run on the Cell, which explains why many multi-platform games ran better on the 360 than on the [=PS3=]: many developers weren't so keen on spending development resources rewriting their game to run properly on the [=PS3=], so what they would do instead was run the game on the general-purpose core and ignore the rest, effectively using only a fraction of the system's power. While developers would later put out some visually stunning games for the system, Sony saw the writing on the wall that the industry had moved towards favoring ease of porting across platforms over the raw power of highly bespoke but hard-to-work-with architectures, and abandoned exotic proprietary chipsets in favor of off-the-shelf, easier-to-program AMD processors from the [=PS4=] onward.
** A common criticism of the UsefulNotes/PlaystationVita is that managing your games and saves is a tremendous hassle: for some reason, deleting a Vita game will also delete its save files, meaning that if you want to make room for a new game you'll have to kiss your progress goodbye. This can be circumvented by transferring the files to a PC or uploading them to the cloud, but the latter requires a [=PlayStation=] Plus subscription to use. One wonders why they don't allow you to simply keep the save file like the [=PS1=] and PSP games do. This is made all the more annoying by the Vita's notoriously small and overpriced proprietary memory cards (itself possibly based on Sony's failed Memory Stick Micro M2 format, not to be confused with M.2 solid state drives, as they have very similar form factors, but of course the M2 is not compatible with the Vita), which means that if you buy a lot of games in digital format, you probably won't be able to hold your whole collection at the same time, even if you shell out big money for a [=32GB=] (the biggest widely-available format, about $60) or [=64GB=] ([[NoExportForYou must be imported from Japan]], can cost over $100, and is reported to sometimes suffer issues such as slow loading, game crashes, and data loss) card.
** And speaking of the Vita, the usage of proprietary memory cards can count as this: rumor has it that the Vita was designed with SD cards in mind, but [[ExecutiveMeddling greedy executives forced Sony engineers to use a proprietary memory card format]], crippling the system's multimedia capabilities when the biggest memory card you can buy for the system is only [=32GB=] ([=64GB=] in Japan) and a sizable [=MP3=] library can easily take up ''half'' of that. Worse, the Vita came with no memory card out of the box and had no flash memory, so any unsuspecting customer might be greeted with a useless slab of plastic until they shell out extra cash for a memory card. Sony's short-sighted greed over the memory cards is cited as one of the major contributing factors of the console's early demise. The PCH-2000 models (often nicknamed the PS Vita Slim) come with internal flash memory, but the damage was already done, and if you insert a memory card, you cannot use the flash memory.
*** Another negative side-effect of Sony using proprietary memory cards for the Vita is how user-unfriendly it is to store media on the console. Yes, the PSP used Sony's proprietary Memory Stick Duo, but at least it was widely adopted by Sony (and some third parties) outside the PSP and thus memory card readers for the Memory Stick Duo are readily available to this day, and with the PSP, you can plug it into a computer and simply drag and drop media files onto the system. The Vita doesn't do that: in what was possibly an effort to lock down the Vita in the wake of the rampant hacking and piracy on the PSP (which may have also influenced the decision to use proprietary memory cards), Sony made it so that the Vita needed to interface with special software installed on the user's PC, called the Content Manager Assistant, to transfer data. In addition, the user needed to select a directory for the Vita to copy from and select which files to copy ''from the Vita'', which is much less convenient than simply dragging and dropping files directly onto the system like you would with a smartphone or the PSP. Also, you need to be logged into the [=PlayStation=] Network to do any of this. Finally, this was all [[ShaggyDogStory rendered moot]] when hackers found an exploit that allowed Vita users to install unauthorized software that enabled the running of homebrew on the system. This was accomplished via, you guessed it, [[HoistByHisOwnPetard the Content Manager Assistant.]]
** The way the Vita handles connecting to cellular networks. If implemented correctly, the Vita could've been wildly innovative in this regard. However, that's not what Sony did. Sony's biggest mistake was opting to have the Vita connect to 3G at a time when 4G had already overtaken 3G in popularity, meaning customers likely could not use their existing data plans to connect their Vitas to the internet over cellular. But that wasn't all: 3G connectivity was exclusive to AT&T subscribers, meaning that even if you were still on 3G, if you were subscribed to a carrier ''other'' than AT&T, you still had to purchase a separate data plan just to connect your Vita to the network, which was $30 monthly for a mid-tier data limit. Even then, 3G functionality was extremely limited, only allowing the user to view messages, browse the internet, download games under 20MB, and play asynchronous (i.e. turn-based) games online. It was such a hassle for such a restricted feature (and many smartphones allowed users to tether Wi-Fi devices to the cellular network, which, while not as convenient, essentially allowed users to use their Vita as if it was connected to standard Wi-Fi) that it simply wasn't worth the extra $50 for the 3G-enabled model. The implementation of 3G was so bad that Sony scrapped it altogether for the PS Vita Slim.



** In what was an early precursor to always-online DRM, the N-Gage required users to be connected to a cellular network to even play games, making the system virtually useless if you didn't either transfer your existing plan over to the N-Gage or buy an entirely different cellular plan if you wanted to keep your old phone and have the N-Gage as a separate system. This was unfortunately the norm for many cell phones at the time, which may have made sense for a regular phone, but not for something that was being advertised as a competitor to the Game Boy Advance. Luckily, [[GoodBadBugs inserting a dummy SIM card can trick the N-Gage into thinking it's connected to a working cellular network and run games offline.]]

to:

** In what was an early precursor to always-online DRM, the N-Gage required users to be connected to a cellular network to even play games, making the system virtually useless if you didn't either transfer your existing plan over to the N-Gage or buy a different cellular plan if you wanted to keep your old phone and have the N-Gage as a separate system. This was unfortunately the norm for many cell phones at the time, which may have made sense for a regular phone, but not for something that was being advertised as a competitor to the Game Boy Advance. Luckily, [[GoodBadBugs inserting a dummy SIM card can trick the N-Gage into thinking it's connected to a working cellular network and run games offline.]]



* While it is fully in WhatCouldHaveBeen territory, Active Enterprises, the developers of the (in)famous ''Videogame/Action52'' had plans to develop a console, the [[https://bootleggames.fandom.com/wiki/Action_Gamemaster Action Gamemaster]], that besides its own cartridges would have been able to play cartridges of other consoles as well as CD-ROM games besides having a TV tuner. Now try to imagine the weight and autonomy of such a device with TheNineties technology, not to mention ergonomics. And then look at the concept art and realize how bizarrely proportioned everything is, that there's no obvious place to put cartridges, [=CDs=], or required accessories, and how vulnerable that screen is. The phrases "pipe dream" and "ahead of its time" barely even begin to describe this idea, and the fact [[PointyHairedBoss Active Enterprises head Vince Perri]] thought it would be anything other than AwesomeButImpractical just goes to show how overambitious and out of touch with not only the video game market but technology in general the man was.

to:

* While it is fully in WhatCouldHaveBeen territory, Active Enterprises, the developers of the (in)famous ''Videogame/Action52'', had plans to develop a console, the [[https://bootleggames.fandom.com/wiki/Action_Gamemaster Action Gamemaster]]. Besides its own cartridges, it would have been able to play cartridges of other consoles as well as CD-ROM games, on top of having a TV tuner. Ergonomics aside, try to imagine the weight and battery life of such a device with TheNineties technology. And then look at the concept art and realize how bizarrely proportioned everything is, that there's no obvious place to put cartridges, [=CDs=], or required accessories, and how vulnerable that screen is. The phrases "pipe dream" and "ahead of its time" barely even begin to describe this idea, and the fact that [[PointyHairedBoss Active Enterprises head Vince Perri]] thought it would be anything other than AwesomeButImpractical just goes to show how overambitious the man was, and how out of touch he was with not only the video game market but technology in general.



* The drums that initially came with ''VideoGame/RockBand 2'' had a battery casing that wasn't tight enough, which would cause the batteries to come loose if it was hit too hard. Naturally, this being a drum set, of course the set was going to be hit repeatedly and harshly. The proposed solution from the developers was to stuff a folded paper towel into the battery case along with the batteries and hope that made everything stay in place. Later versions of the drum set wouldn't have such a problem, but it makes one wonder how this got missed in playtesting.

to:

* The drums that initially came with ''VideoGame/RockBand 2'' had a battery casing that wasn't tight enough, which would cause the batteries to come loose if it was hit too hard. Since this was a drum set, it was going to be hit repeatedly and harshly. The proposed solution from the developers was to stuff a folded paper towel into the battery case along with the batteries and hope that made everything stay in place. Later versions of the drum set wouldn't have such a problem, but it makes one wonder how this got missed in playtesting.



*** Note that GPS isn't limited to either gold-colored plastics or the ''Transformers'' line. Ultra Magnus' original ''Diaclone'' toy (Powered Convoy, specifically the chrome version) had what is termed "Blue Plastic Syndrome" (which was thankfully fixed for the Ultra Magnus release, which uses white instead of blue), and over in the ''GI Joe'' line the original Serpentor figure had GPS in his waist.

to:

*** Note that GPS isn't limited to either gold-colored plastics or the ''Transformers'' line. Ultra Magnus' original ''Diaclone'' toy (Powered Convoy, specifically the chrome version) had what is termed "Blue Plastic Syndrome" (which was thankfully fixed for the Ultra Magnus release, which uses white instead of blue), and over in the ''GI Joe'' line the original Serpentor figure had GPS in his waist.



** Unposeable "brick" Transformers. The Generation 1 toys [[GrandfatherClause can get away with it]] (mainly because balljoints weren't widely used until ''Beast Wars'' a decade later - though they were used as early as 1985 on Astrotrain - and safety was more important that poseability), but in later series, like most ''Armada'' and some ''Energon'' toys (especially the Powerlinked modes), they are atrocious - ''Energon'' Wing Saber is {{literal|Metaphor}}ly a FlyingBrick, and not [[Franchise/{{Superman}} in the good way]]. With today's toy technology, there just isn't an excuse for something with all the poseability of a cinderblock and whose transformation basically consists of lying the figure down, especially the larger and more expensive ones. Toylines aimed at younger audiences (such as ''Rescue Bots'' and ''Robots in Disguise'' 2015) are a little more understandable, but for the lines aimed at general audiences or older fans (such as ''Generations''), it's inexcusable.

to:

** Unposeable "brick" Transformers. The Generation 1 toys [[GrandfatherClause can get away with it]] (mainly because balljoints weren't widely used until ''Beast Wars'' a decade later - though they were used as early as 1985 on Astrotrain - and safety was more important that poseability), but in later series, like most ''Armada'' and some ''Energon'' toys (especially the Powerlinked modes), they are atrocious - ''Energon'' Wing Saber is {{literal|Metaphor}}ly a FlyingBrick, and not [[Franchise/{{Superman}} in the good way]]. With today's toy technology, there just isn't an excuse for something with all the poseability of a cinderblock and whose transformation basically consists of little more than lying the figure down, especially the larger and more expensive ones. Toylines aimed at younger audiences (such as ''Rescue Bots'' and ''Robots in Disguise'' 2015) are a little more understandable, but for the lines aimed at general audiences or older fans (such as ''Generations''), it's inexcusable.



* Aqua Dots (Bindeez in its native Australia) is (or was) a fun little collection of interlocking beads designed for the creation of multidimensional shapes, as seen on TV. You had to get them wet before they would stick together, but the coating released one ingredient it shouldn't have when exposed to water - a chemical that metabolizes into the date-rape drug [[http://en.wikipedia.org/wiki/Gamma-Hydroxybutyric_acid GHB]]. Should someone put ''that'' in their mouths... This wasn't the fault of the company that made them, but rather the [[MadeInCountryX Chinese plant]] that manufactured the toys. Essentially, they found out that some chemical was much less expensive than the one they were supposed to be using, but still worked. They didn't do the research to find out that said chemical metabolizes into GHB, or else they didn't care (and also didn't tell the company that they made the swap). And yet, for all the Chinese toy manufacturer chaos that was going on in the media at the time, the blame fell squarely on the toy company for this. They still exist, though thankfully with a non-GHB formulation. They were renamed to Pixos (Beados in Australia) and marketed as "safety tested". In fact, they were marketed the same way Aqua Dots were, with the same announcer and background music ([[https://www.youtube.com/watch?v=vf_9PbBrK34 compare]] and [[https://www.youtube.com/watch?v=_7r-L6YxAnM contrast]]). Now, they are marketed in America under the name of Beados.

to:

* Aqua Dots (Bindeez in its native Australia) is (or was) a fun little collection of interlocking beads designed for the creation of multidimensional shapes, as seen on TV. You had to get them wet before they would stick together, but the coating released one ingredient it shouldn't have when exposed to water - a chemical that metabolizes into the date-rape drug [[http://en.wikipedia.org/wiki/Gamma-Hydroxybutyric_acid GHB]]. Should someone put ''that'' in their mouths... This wasn't the fault of the company that made them, but rather the [[MadeInCountryX Chinese plant]] that manufactured the toys. They found out that some chemical was much less expensive than the one they were supposed to be using, but still worked. They didn't do the research to find out that said chemical metabolizes into GHB, or else they didn't care (and also didn't tell the company that they made the swap). And yet, for all the Chinese toy manufacturer chaos that was going on in the media at the time, the blame fell squarely on the toy company for this. They still exist, though thankfully with a non-GHB formulation. They were renamed to Pixos (Beados in Australia) and marketed as "safety tested". In fact, they were marketed the same way Aqua Dots were, with the same announcer and background music ([[https://www.youtube.com/watch?v=vf_9PbBrK34 compare]] and [[https://www.youtube.com/watch?v=_7r-L6YxAnM contrast]]). Now, they are marketed in America under the name of Beados.



** ''Series/MyNameIsEarl'' had a minor character with a similar gun. Given that he also had a real gun... And take two guesses how said character wound up dead in a later episode.

to:

** ''Series/MyNameIsEarl'' had a minor character with a similar gun. Given that he also had a real gun... And take two guesses how said character wound up dead in a later episode.



* A ''Film/TheDarkKnight'' tie-in "hidden blade" katana toy had a hard plastic, spring-loaded blade that shot out of the handle with such force that it could cause ''blunt force trauma'' if the kids weren't expecting it, and that could be activated by an easily-hit trigger in the handle. Essentially, they were marketing an oversized blunt switchblade.

to:

* A ''Film/TheDarkKnight'' tie-in "hidden blade" katana toy had a hard plastic, spring-loaded blade that shot out of the handle with such force that it could cause ''blunt force trauma'' if the kids weren't expecting it, and that could be activated by an easily-hit trigger in the handle. They were marketing an oversized blunt switchblade.



* In a similar case to the ''Transformers'' GPS above, Toys/{{LEGO}} also fumbled up their own plastic around 2007, which resulted in nearly all of the lime-green pieces becoming ridiculously fragile. This affected the ''Toys/{{Bionicle}}'' sets of that era greatly, which were already prone to breaking due to the faulty sculpting of the ball-socket joints. Since that line of sets had more lime-colored pieces than usual, needless to say, fans were not amused with the ordeal, as it meant that they ''couldn't take apart and rebuild their LEGO sets''. Reportedly, some of these lime pieces broke right at the figures' first assembly.

to:

* In a similar case to the ''Transformers'' GPS above, Toys/{{LEGO}} also fumbled up their own plastic around 2007, which resulted in nearly all of the lime-green pieces becoming ridiculously fragile. This affected the ''Toys/{{Bionicle}}'' sets of that era greatly, which were already prone to breaking due to the faulty sculpting of the ball-socket joints. Since that line of sets had more lime-colored pieces than usual, needless to say, fans were not amused with the ordeal, as it meant that they ''couldn't take apart and rebuild their LEGO sets''. Reportedly, some of these lime pieces broke right at the figures' first assembly.



** Razor of all companies marketed, for a month or so in 2009, a ''scooter'' that came with its own spark generator.

to:

** Razor of all companies marketed, for a month or so in 2009, a ''scooter'' that came with its own spark generator.



* On August 6, 2005, an [[https://en.wikipedia.org/wiki/ATR_72 ATR-72]] (by no means a terrible aircraft) operating as [[https://en.wikipedia.org/wiki/Tuninter_Flight_1153 Tuninter Flight 1153]] unexpectedly ran out of fuel, and the pilots, in full OhCrap mode, fruitlessly tried to restart the engines rather than feathering because the fuel gauge incorrectly showed that there was plenty of fuel, leading to the plane ditching with 16 fatalities just short of an emergency landing in Sicily. Turns out, maintenance guys installed a fuel gauge designed for the far smaller ATR-''[[https://en.wikipedia.org/wiki/ATR_42 42]]'' because the correct gauge wasn't logged in the database properly, and said database also indicated that the 42 gauge could be used with the 72. Why was such a staggering failure with such tragic consequences even possible? Because ATR got lazy and didn't bother to change the fuel gauge design for the 72, resulting in two different non-interchangeable safety-critical modules being compatible with sockets in two totally different planes. For obvious reasons, the ANSV recommended that a redesign of the fuel gauge be mandated to remove this compatibility.

to:

* On August 6, 2005, an [[https://en.wikipedia.org/wiki/ATR_72 ATR-72]] (by no means a terrible aircraft) operating as [[https://en.wikipedia.org/wiki/Tuninter_Flight_1153 Tuninter Flight 1153]] unexpectedly ran out of fuel, and the pilots, in full OhCrap mode, fruitlessly tried to restart the engines rather than feathering because the fuel gauge incorrectly showed that there was plenty of fuel, leading to the plane ditching with 16 fatalities just short of an emergency landing in Sicily. Turns out, maintenance guys installed a fuel gauge designed for the far smaller ATR-''[[https://en.wikipedia.org/wiki/ATR_42 42]]'' because the correct gauge wasn't logged in the database properly, and said database also indicated that the 42 gauge could be used with the 72. Why was such a staggering failure with such tragic consequences even possible? Because ATR got lazy and didn't bother to change the fuel gauge design for the 72, resulting in two different non-interchangeable safety-critical modules being compatible with sockets in two totally different planes. For obvious reasons, the ANSV recommended that a redesign of the fuel gauge be mandated to remove this compatibility.



** If you have a Dodge Caravan with a central console, beware of laying credit-card-sized items on the rolling shutter that conceals the 12V outlets. They can easily slide in, even on later models such as the late-2010 ones, where you'd think a design quirk like this would have been addressed much earlier. Drivers have reported losing credit cards this way after laying them on the shutter closest to the dashboard. Fortunately, it's not too difficult to partly disassemble the center console to retrieve lost items, but one must still be careful of disturbing the wiring and its arguably a little too involved for what should be a more user-friendly retrieval.

to:

** If you have a Dodge Caravan with a central console, beware of laying credit-card-sized items on the rolling shutter that conceals the 12V outlets. They can easily slide in, even on later models such as the late-2010 ones, where you'd think a design quirk like this would have been addressed much earlier. Drivers have reported losing credit cards this way after laying them on the shutter closest to the dashboard. Fortunately, it's not too difficult to partly disassemble the center console to retrieve lost items, but one must still be careful of disturbing the wiring, and it's a little too involved for what should be a more user-friendly retrieval.



** [[https://www.youtube.com/watch?v=g4StCCv8QW0 Testing]] by [=YouTube=] channel [=InRange TV=] examined this alleged design flaw, using both (newer) American-made and (older) German-made parts. While there was a small amount of shifting in the grouping of shots fired from the German-made parts, the ultimate conclusion is that dumping a magazine's worth of ammo won't make the gun horribly inaccurate. They concluded that the barrel caused the slight shifting, not the trunnion (the part that holds the barrel to the receiver) shifting or the receiver itself warping. And even then, the shifting isn't enough to make the gun ineffective at combat ranges. Also, a barrel made with modern manufacturing technologies and metallurgy pretty much renders the grouping problem moot.

to:

** [[https://www.youtube.com/watch?v=g4StCCv8QW0 Testing]] by [=YouTube=] channel [=InRange TV=] examined this alleged design flaw, using both (newer) American-made and (older) German-made parts. While there was a small amount of shifting in the grouping of shots fired from the German-made parts, the ultimate conclusion is that dumping a magazine's worth of ammo won't make the gun horribly inaccurate. They concluded that the barrel caused the slight shifting, not the trunnion (the part that holds the barrel to the receiver) shifting or the receiver itself warping. And even then, the shifting isn't enough to make the gun ineffective at combat ranges. A barrel made with modern manufacturing technologies and metallurgy will also render the grouping problem moot.



** A similar problem was repeated after UsefulNotes/WorldWarII with the T24, an experimental version of the MG 42 machine gun converted to .30-06. If they had been custom-built for the round, there probably wouldn't have been an issue, as the base design did prove to be relatively adaptable (as seen with other post-war variations downsized to 7.62 and 5.56mm NATO), but that's the problem - they weren't custom-built for the round, they were made out of existing machine guns with the bare minimum of newly-manufactured parts to convert them to the longer .30-06. Two prototypes were built and neither was acceptable, with one of them suffering 51 stoppages within 1,500 rounds.

to:

** After UsefulNotes/WorldWarII came the T24, an experimental version of the MG 42 machine gun converted to .30-06. If they had been custom-built for the round, there probably wouldn't have been an issue, as the base design did prove to be relatively adaptable (as seen with other post-war variations downsized to 7.62 and 5.56mm NATO), but that's the problem - they weren't custom-built for the round, they were made out of existing machine guns with the bare minimum of newly-manufactured parts to convert them to the longer .30-06. Two prototypes were built and neither was acceptable, with one of them suffering 51 stoppages within 1,500 rounds.



* Linpus Linux Lite, as shipped with the Acer Aspire One. Now in fairness to Linpus, its GUI could not possibly be more intuitive (plus a boot time of just 20 seconds, and out-of-the-box recognition of xD picture cards and other non-SD formats, if the computer has a reader that supports them), but there is a difference between designing a distro for complete beginners and designing a distro with several directories ''hard-coded'' to be read-only and Add/Remove Programs accessible only by some fairly complex command-line tinkering. It doesn't help that the sum total of its official documentation is a ten-page manual containing no information an experienced user couldn't figure out within five minutes of booting, that updating it was hell, and that boot times became ''long'' (several minutes) if you did not turn the computer off properly.

* Xandros Linux, as used by Asus in the first few models of their [=EeePC=] line - arguably the distro that Linpus was developed to compete with - was ''a little'' better, but not all that much. Aside from the fact that big parts of it weren't actually open-source - notoriously a big no-no in the Linux world - you still needed command-line magic to switch it from its idiot-proof UI into something a non-newbie would want to use. The "advanced" UI required delving into cryptic text-based files for configuration, and community-made utilities (that may or may not have been competently programmed) were necessary to access some features. Compatibility with existing software was okay-ish at first but became spotty later on. Eventually Asus saw the writing on the wall and started phasing out Xandros entirely; they switched to shipping [=EeePCs=] with Windows exclusively and started neglecting the existing Xandros user base. Anyone trying to use Xandros past that point would boot into the computing equivalent of a ghost town, with no updates, aging repositories, and security issues piling up. In the end, Xandros was retired, and common advice for people getting second-hand [=EeePCs=] was "wipe it immediately and install anything else".

to:

* Linpus Linux Lite, as shipped with the Acer Aspire One. Now in fairness to Linpus, its GUI could not possibly be more intuitive (plus a boot time of just 20 seconds, and out-of-the-box recognition of xD picture cards and other non-SD formats, if the computer has a reader that supports them), but there is a difference between designing a distro for complete beginners and designing a distro with several directories ''hard-coded'' to be read-only and Add/Remove Programs accessible only by some fairly complex command-line tinkering. It doesn't help that the sum total of its official documentation is a ten-page manual containing no information an experienced user couldn't figure out within five minutes of booting, that updating it was hell, and that boot times became ''long'' (several minutes) if you did not turn the computer off properly.

* Xandros Linux, as used by Asus in the first few models of their [=EeePC=] line - arguably one of the distros Linpus was developed to compete with - was ''a little'' better, but not all that much. Aside from the fact that big parts of it weren't actually open-source - notoriously a big no-no in the Linux world - you still needed command-line magic to switch it from its idiot-proof UI into something a non-newbie would want to use. The "advanced" UI required delving into cryptic text-based files for configuration, and community-made utilities (that may or may not have been competently programmed) were necessary to access some features. Compatibility with existing software was okay-ish at first but became spotty later on. Eventually Asus saw the writing on the wall and started phasing out Xandros entirely; they switched to shipping [=EeePCs=] with Windows exclusively and started neglecting the existing Xandros user base. Anyone trying to use Xandros past that point would boot into the computing equivalent of a ghost town, with no updates, aging repositories, and security issues piling up. In the end, Xandros was retired, and common advice for people getting second-hand [=EeePCs=] was "wipe it immediately and install anything else".



* The stainless steel teapots and coffee pots commonly found in British cafes and, notoriously, on British Rail trains: while they look like design classics in brushed steel, even the handles are made out of bare steel, which is very good at conducting the heat of two pints of liquid heated to boiling point. If you're lucky, the designer will have placed a thin layer of insulating foam between the pot and the handle, meaning that the handle will warm up, but not uncomfortably so; otherwise, you'll end up with a pot of tea that you cannot even lift until it cools to lukewarm. British comedian Creator/BenElton cited this among other examples, and speculated that the British government has an entire department founded to butcher the designs of simple everyday tools.

to:

* The stainless steel teapots and coffee pots commonly found in British cafes and, notoriously, on British Rail trains: while they look like design classics in brushed steel, even the handles are made out of bare steel, which is very good at conducting the heat of two pints of liquid heated to boiling point. If you're lucky, the designer will have placed a thin layer of insulating foam between the pot and the handle, meaning that the handle will warm up, but not uncomfortably so; otherwise, you'll end up with a pot of tea that you cannot even lift until it cools to lukewarm. British comedian Creator/BenElton quoted this among other examples, and speculated that the British government has a department founded to butcher the designs of simple everyday tools.



** By 2018, Samsung and Apple devices had decided to {{Bowdlerise}} the gun (🔫) for whatever reason. What looked like a realistic pistol on some devices (hell, it's even named that, and still has that design on the Unicode code charts) now looks like [[FamilyFriendlyFirearms a laser gun]] or a water gun, respectively. The same happened on Windows 10 desktops. Most applications will show a font-colored laser gun, while web browsers show a green water gun with orange "magazine". As [[https://blog.emojipedia.org/apple-and-the-gun-emoji/ this article shows]], this could go horribly wrong if, say, you were planning a water gun fight at the park, and used the emoji, making it look like you were planning a mass shooting to users of other devices. Eventually, by the end of 2018, other platforms such as Android, Windows, Twitter, and Facebook also changed their firearms into toy pistols, while the [=EmojiOne=] set put out both, with the former design being default and the latter optional.

to:

** By 2018, Samsung and Apple devices had decided to {{Bowdlerise}} the gun (🔫) for whatever reason. What looked like a realistic pistol on some devices (it's even ''named'' that, and still has that design on the Unicode code charts) now looks like [[FamilyFriendlyFirearms a laser gun]] or a water gun, respectively. The same happened on Windows 10 desktops. Most applications will show a font-colored laser gun, while web browsers show a green water gun with orange "magazine". As [[https://blog.emojipedia.org/apple-and-the-gun-emoji/ this article shows]], this could go horribly wrong if, say, you were planning a water gun fight at the park, and used the emoji, making it look like you were planning a mass shooting to users of other devices. Eventually, by the end of 2018, other platforms such as Android, Windows, Twitter, and Facebook also changed their firearms into toy pistols, while the [=EmojiOne=] set put out both, with the former design being default and the latter optional.
Is there an issue? Send a MessageReason:
None


* Interactive Flight Technologies' Inflight Entertainment Network (or IFEN), installed in the Alitalia and--most infamously--Swissair fleets, was a pioneer of digital in-flight entertainment systems, but it was so rife with reliability concerns that it's astonishing it didn't end up a TrendKiller. IFEN was power-hungry, prone to overheating, heavy, unreliable, and [[NoOffButton couldn't easily be turned off by the flight crew in case of emergency]]. Qantas considered it but rejected it on the basis that the heat it produced kept corrupting the hard drives, and Alitalia were dissatisfied because they kept having to replace an average of three to five underseat units each flight. And then Swissair Flight 111 caught fire and crashed off the coast of Canada, and wet-arcing from the IFEN system was implicated as the initial cause of the fire, causing the reliability concerns to turn into ''[[FromBadToWorse safety concerns]]''; Swissair ended up disabling the system in their entire fleet less than two months after the crash. A year after the crash, the FAA banned IFEN on MD-11s due to the unsafe design and installation.

to:

* Interactive Flight Technologies' Inflight Entertainment Network (or IFEN), installed in the Alitalia and--most infamously--Swissair fleets, was a pioneer of digital in-flight entertainment systems, but it was so rife with reliability concerns that it's astonishing it didn't end up a TrendKiller... especially given the fact it ended up '''[[https://en.wikipedia.org/wiki/Swissair_Flight_111 killing people]]'''. IFEN was power-hungry, prone to overheating, heavy, unreliable, and [[NoOffButton couldn't easily be turned off by the flight crew in case of emergency]]. Qantas considered it but rejected it on the basis that the heat it produced kept corrupting the hard drives, and Alitalia were dissatisfied because they kept having to replace an average of three to five underseat units each flight. And then the aircraft flying Swissair Flight 111 caught fire and crashed off the coast of Canada, and wet-arcing from the IFEN system was implicated as the initial cause of the fire, causing the reliability concerns to turn into ''[[FromBadToWorse safety concerns]]''; Swissair ended up disabling the system in their entire fleet less than two months after the crash. A year after the crash, the FAA banned IFEN on MD-11s due to the unsafe design and installation.
Is there an issue? Send a MessageReason:
None


* Interactive Flight Technologies' Inflight Entertainment Network (or IFEN), installed in the Alitalia and--most infamously--Swissair fleets, was a pioneer of digital in-flight entertainment systems, but it was so rife with reliability concerns that it's astonishing it didn't end up a TrendKiller. IFEN was power-hungry, prone to overheating, heavy, unreliable, and [[NoOffButton couldn't easily be turned off by the flight crew in case of emergency]]. Qantas considered it but rejected it on the basis that the heat it produced kept corrupting the hard drives, and Alitalia were dissatisfied because they kept having to replace an average of three to five underseat units each flight. And then Swissair Flight 111 caught fire and crashed off the coast of Canada, and wet-arcing from the IFEN system was implicated as a cause of the fire, [[FromBadToWorse causing reliability concerns to turn into safety concerns]]; Swissair ended up disabling the system in their entire fleet less than two months after the crash. A year after the crash, the FAA banned IFEN on MD-11s due to the unsafe design and installation.

to:

* Interactive Flight Technologies' Inflight Entertainment Network (or IFEN), installed in the Alitalia and--most infamously--Swissair fleets, was a pioneer of digital in-flight entertainment systems, but it was so rife with reliability concerns that it's astonishing it didn't end up a TrendKiller. IFEN was power-hungry, prone to overheating, heavy, unreliable, and [[NoOffButton couldn't easily be turned off by the flight crew in case of emergency]]. Qantas considered it but rejected it on the basis that the heat it produced kept corrupting the hard drives, and Alitalia were dissatisfied because they kept having to replace an average of three to five underseat units each flight. And then Swissair Flight 111 caught fire and crashed off the coast of Canada, and wet-arcing from the IFEN system was implicated as the initial cause of the fire, causing the reliability concerns to turn into ''[[FromBadToWorse safety concerns]]''; Swissair ended up disabling the system in their entire fleet less than two months after the crash. A year after the crash, the FAA banned IFEN on MD-11s due to the unsafe design and installation.
Is there an issue? Send a MessageReason:
None


* Interactive Flight Technologies' Inflight Entertainment Network (or IFEN), installed in the Alitalia and--most infamously--Swissair fleets, was a pioneer of digital in-flight entertainment systems, but it was so rife with reliability concerns that it's astonishing it didn't end up a TrendKiller. IFEN was power-hungry, prone to overheating, heavy, unreliable, and [[NoOffButton couldn't easily be turned off by the flight crew in case of emergency]]. Qantas considered it but rejected it on the basis that the heat it produced kept corrupting the hard drives, and Alitalia were dissatisfied because they kept having to replace an average of three to five underseat units each flight. And then Swissair Flight 111 caught fire and crashed off the coast of Canada, and wet-arcing from the IFEN system was implicated as a cause of the fire, [[FromBadToWorse causing IFEN's perception to go from "unreliable" to "fire hazard"]]; Swissair ended up disabling the system in their entire fleet less than two months after the crash. A year after the crash, the FAA banned IFEN on MD-11s due to the unsafe design and installation.

to:

* Interactive Flight Technologies' Inflight Entertainment Network (or IFEN), installed in the Alitalia and--most infamously--Swissair fleets, was a pioneer of digital in-flight entertainment systems, but it was so rife with reliability concerns that it's astonishing it didn't end up a TrendKiller. IFEN was power-hungry, prone to overheating, heavy, unreliable, and [[NoOffButton couldn't easily be turned off by the flight crew in case of emergency]]. Qantas considered it but rejected it on the basis that the heat it produced kept corrupting the hard drives, and Alitalia were dissatisfied because they kept having to replace an average of three to five underseat units each flight. And then Swissair Flight 111 caught fire and crashed off the coast of Canada, and wet-arcing from the IFEN system was implicated as a cause of the fire, [[FromBadToWorse causing reliability concerns to turn into safety concerns]]; Swissair ended up disabling the system in their entire fleet less than two months after the crash. A year after the crash, the FAA banned IFEN on MD-11s due to the unsafe design and installation.
Is there an issue? Send a MessageReason:
None


* The initial cabinet for the original ''VideoGame/StreetFighterI'' had two giant pressure sensitive buttons for punches and kicks, with the strength of your attacks increasing the harder you slammed down on them. Putting aside the stiffness of the combat itself, repeatedly having to slam your knuckle down on big rubber buttons was a good way to make it very sore very quickly. And that was just if you were consistently hitting the buttons; more often than not, players would end up damaging their hands, or the machine, or both, making the cabinet an absolute maintenance nightmare. Needless to say, the two-button cabinet flopped, and Creator/{{Capcom}} quickly commissioned a new cabinet design that replaced the two big rubber buttons with six smaller plastic buttons, with two rows for light, medium and heavy attacks. Not only did the new cabinet significantly outsell the original and make the game profitable, but the six-button control scheme would become the standard for the rest of the series, especially after it was paired with ''VideoGame/StreetFighterII''’s much more refined combat.

to:

* The original cabinet for the first ''VideoGame/StreetFighterI'' had two giant pressure-sensitive buttons for punches and kicks, with the strength of your attacks increasing the harder you slammed down on them. Putting aside the stiffness of the combat itself, repeatedly having to slam your fist down on big rubber buttons was a good way to make it very sore very quickly. And that was just if you were consistently hitting the buttons; more often than not, players would end up missing them and damaging their hands, or the machine, or both, making the cabinet an absolute maintenance nightmare. Needless to say, the two-button cabinet flopped, and Creator/{{Capcom}} quickly commissioned a new cabinet design that replaced the two big rubber buttons with six smaller plastic buttons, with two rows for light, medium and heavy attacks. Not only did the new cabinet significantly outsell the original and make the game profitable, but the six-button control scheme would become the standard for the rest of the series, especially after it was paired with ''VideoGame/StreetFighterII''’s much more refined combat.
Is there an issue? Send a MessageReason:
None

Added DiffLines:

-->'''[[https://youtu.be/7p4bteKOJiE&t=16m36s Professor Thorgi:]]''' ''Street Fighter 1'' is the only fighting game in history that actually fought back.
Is there an issue? Send a MessageReason:
None

Added DiffLines:

* The initial cabinet for the original ''VideoGame/StreetFighterI'' had two giant pressure sensitive buttons for punches and kicks, with the strength of your attacks increasing the harder you slammed down on them. Putting aside the stiffness of the combat itself, repeatedly having to slam your knuckle down on big rubber buttons was a good way to make it very sore very quickly. And that was just if you were consistently hitting the buttons; more often than not, players would end up damaging their hands, or the machine, or both, making the cabinet an absolute maintenance nightmare. Needless to say, the two-button cabinet flopped, and Creator/{{Capcom}} quickly commissioned a new cabinet design that replaced the two big rubber buttons with six smaller plastic buttons, with two rows for light, medium and heavy attacks. Not only did the new cabinet significantly outsell the original and make the game profitable, but the six-button control scheme would become the standard for the rest of the series, especially after it was paired with ''VideoGame/StreetFighterII''’s much more refined combat.
Is there an issue? Send a MessageReason:
None

Added DiffLines:

* An ongoing problem for Intel and AMD is what default settings motherboard manufacturers use with regard to performance boosting and how they build the circuitry to support higher power loads. For instance:
** Intel specifies in their more recent processors two power levels: a short-duration, high-power boost and an infinite-duration, lower-power boost. The duration of the high-power boost is configurable, meaning some motherboard manufacturers simply set that value to effectively infinity, causing the CPU to try and boost as hard as possible for far longer than it was designed to; a quick way to check which limits a board has actually programmed is sketched after this list.
** When AMD's Ryzen [=7800X3D=] processors were literally blowing up, analysis suggested that motherboard manufacturers were, to varying degrees, partly to blame. One manufacturer in particular was found to have set its overcurrent protection (which would've helped prevent the CPU from blowing up) too high for too long, and this was on a board approaching $1000.
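For the Intel case above, the limits a board actually programs are not a secret: on Linux they are exposed through the powercap framework, so you can compare them against Intel's published numbers yourself. The snippet below is a minimal, hypothetical sketch under those assumptions (a Linux machine with the intel_rapl driver loaded, with the usual layout of one package domain where constraint 0 is the long-term limit, PL1, and constraint 1 is the short-term limit, PL2); it is not taken from the entry above and is not tied to any particular board or vendor mentioned in it.

    #!/usr/bin/env python3
    # Hypothetical sketch, not part of the original entry: print the package power
    # limits the platform actually programmed, using the Linux "intel_rapl"
    # powercap interface. "intel-rapl:0" is normally the first CPU package;
    # constraint 0 is the long-term limit (PL1), constraint 1 the short-term one (PL2).
    from pathlib import Path

    RAPL = Path("/sys/class/powercap/intel-rapl:0")

    def read(name: str) -> str:
        return (RAPL / name).read_text().strip()

    if not RAPL.exists():
        raise SystemExit("intel_rapl powercap interface not found (non-Intel CPU or driver not loaded)")

    for idx in (0, 1):
        label = read(f"constraint_{idx}_name")                        # e.g. "long_term" / "short_term"
        watts = int(read(f"constraint_{idx}_power_limit_uw")) / 1e6   # microwatts -> watts
        try:
            window = int(read(f"constraint_{idx}_time_window_us")) / 1e6  # microseconds -> seconds
        except (FileNotFoundError, ValueError):
            window = None  # some platforms don't expose a time window for every constraint
        print(f"{label}: {watts:.0f} W" + (f" over a {window:.3f} s window" if window else ""))

If the long-term ("long_term"/PL1) figure printed here is far above the processor's rated base power, or the time window is absurdly large, the board is running the chip outside Intel's defaults.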

Added: 1122

Removed: 1105

Is there an issue? Send a MessageReason:
None


* Interactive Flight Technologies' Inflight Entertainment Network (or IFEN), installed in the Alitalia and--most infamously--Swissair fleets, was a pioneer of digital in-flight entertainment systems, but it was so rife with reliability concerns that it's astonishing it didn't end up a TrendKiller. IFEN was power-hungry, hot, heavy, unreliable, and [[NoOffButton couldn't easily be turned off by the flight crew in case of emergency]]. Qantas considered it but rejected it on the basis that the heat it produced kept corrupting the hard drives, and Alitalia were dissatisfied because they kept having to replace an average of three to five underseat units each flight. And then Swissair Flight 111 caught fire and crashed off the coast of Canada, and wet-arcing from the IFEN system was implicated as a cause of the fire, [[FromBadToWorse causing IFEN's perception to go from "unreliable" to "fire hazard"]]; Swissair ended up disabling the system in their entire fleet less than two months after the crash. A year after the crash, the FAA banned IFEN on MD-11s due to the unsafe design and installation.


Added DiffLines:

* Interactive Flight Technologies' Inflight Entertainment Network (or IFEN), installed in the Alitalia and--most infamously--Swissair fleets, was a pioneer of digital in-flight entertainment systems, but it was so rife with reliability concerns that it's astonishing it didn't end up a TrendKiller. IFEN was power-hungry, prone to overheating, heavy, unreliable, and [[NoOffButton couldn't easily be turned off by the flight crew in case of emergency]]. Qantas considered it but rejected it on the basis that the heat it produced kept corrupting the hard drives, and Alitalia were dissatisfied because they kept having to replace an average of three to five underseat units each flight. And then Swissair Flight 111 caught fire and crashed off the coast of Canada, and wet-arcing from the IFEN system was implicated as a cause of the fire, [[FromBadToWorse causing IFEN's perception to go from "unreliable" to "fire hazard"]]; Swissair ended up disabling the system in their entire fleet less than two months after the crash. A year after the crash, the FAA banned IFEN on MD-11s due to the unsafe design and installation.
Is there an issue? Send a MessageReason:
None

Added DiffLines:

* Interactive Flight Technologies' Inflight Entertainment Network (or IFEN), installed in the Alitalia and--most infamously--Swissair fleets, was a pioneer of digital in-flight entertainment systems, but it was so rife with reliability concerns that it's astonishing it didn't end up a TrendKiller. IFEN was power-hungry, hot, heavy, unreliable, and [[NoOffButton couldn't easily be turned off by the flight crew in case of emergency]]. Qantas considered it but rejected it on the basis that the heat it produced kept corrupting the hard drives, and Alitalia were dissatisfied because they kept having to replace an average of three to five underseat units each flight. And then Swissair Flight 111 caught fire and crashed off the coast of Canada, and wet-arcing from the IFEN system was implicated as a cause of the fire, [[FromBadToWorse causing IFEN's perception to go from "unreliable" to "fire hazard"]]; Swissair ended up disabling the system in their entire fleet less than two months after the crash. A year after the crash, the FAA banned IFEN on MD-11s due to the unsafe design and installation.
Is there an issue? Send a MessageReason:
Thankfully this issue was fixed in a redesign.


* Telenor's email service has a "report spam" button that deletes the selected message(s) in addition to reporting it/them as spam. The problem is that there's no confirmation before it irreversibly deletes the message, and the button is just below the "move email" button. You'd better not misclick while moving something important!


to:

* A previous version of Telenor's email service had a "report spam" button that deleted the selected message(s) in addition to reporting it/them as spam. The problem was that there was no confirmation before it irreversibly deleted the message(s), and the button was right below the "move email" button. You'd better not misclick while moving something important!

Is there an issue? Send a MessageReason:
None


** One particular design oddity is that the USB 3.0 port located on the back of the dock is limited by software to USB 2.0 speeds. Rumors of a patch to "unlock" the port to USB 3.0 speeds circulated but no such patch has been released. It turns out that USB 3.0 can interfere with [=2.4GHz=] wireless transmitters if not properly shielded and if placed too close to the transmitter, as was the case with the USB 3.0 port on the Nintendo Switch dock. Nintendo likely noticed the problem during QA, and instead of replacing it with a USB 2.0 port or even investing money into properly shielding the circuit board in the dock, Nintendo simply decided to disable USB 3.0 functionality altogether. Indeed, hacking in USB 3.0 support causes issues with wireless controller functionality, and the port was replaced with an ethernet port for the OLED model's dock (HEG-007).

to:

** One particular design oddity is that the USB 3.0 port located on the back of the dock is limited by software to USB 2.0 speeds. Rumors of a patch to "unlock" the port to USB 3.0 speeds circulated but no such patch has been released. It turns out that USB 3.0 can interfere with [=2.4GHz=] wireless transmitters if not properly shielded and if placed too close to the transmitter, as was the case with the USB 3.0 port on the Nintendo Switch dock. Nintendo likely noticed the problem during QA, and instead of replacing it with a USB 2.0 port or even investing money into properly shielding the circuit board in the dock, Nintendo simply decided to disable USB 3.0 functionality altogether. Indeed, hacking in USB 3.0 support causes issues with wireless controller functionality, and the port was replaced with an ethernet port (which ''still'' runs through a USB 2.0 bus) for the OLED model's dock (HEG-007).
Is there an issue? Send a MessageReason:
A Date With Rosie Palms is no longer a trope


* The infamous Franchise/HarryPotter-themed Nimbus 2000 toy broomstick. Needless to say, there are [[DidntThinkThisThrough certain issues]] with a toy that vibrates and is designed to be "ridden" between the legs. At least one of the intended customers found that their older sister developed [[ADateWithRosiePalms a remarkable interest in it]].

to:

* The infamous Franchise/HarryPotter-themed Nimbus 2000 toy broomstick. Needless to say, there are [[DidntThinkThisThrough certain issues]] with a toy that vibrates and is designed to be "ridden" between the legs. At least one of the intended customers found that [[DoesThisRemindYouOfAnything their older sister developed a remarkable interest in it]].
Is there an issue? Send a MessageReason:
None

Added DiffLines:

* Xandros Linux, as used by Asus in the first few models of their [=EeePC=] line - arguably the distro that Linpus was developed to compete with - was ''a little'' better, but not all that much. Aside from the fact that big parts of it weren't actually open-source - notoriously a big no-no in the Linux world - you still needed command-line magic to switch it from its idiot-proof UI into something a non-newbie would want to use. The "advanced" UI required delving into cryptic text-based files for configuration, and community-made utilities (that may or may not have been competently programmed) were necessary to access some features. Compatibility with existing software was okay-ish at start but became spotty later on. Eventually Asus saw the writing on the wall and started phasing out Xandros entirely; they switched to shipping [=EeePCs=] with Windows exclusively and started neglecting the existing Xandros user base. Anyone trying to use Xandros past that point would boot into the computing equivalent of a ghost town, with no updates, aging repositories and security issues piling up. Eventually Xandros was retired, and common advice for people getting second-hand [=EeePCs=] was "wipe it immediately and install anything else".
Is there an issue? Send a MessageReason:
None


*** One of the most commonly praised aspects of the Sega Saturn was its controller. The Sega Dreamcast pad, in what was certainly an additional sore spot for anyone scorned by the Saturn's early cancellation, was a disappointment. The overall idea wasn't a bad one: an evolution of the Saturn 3d pad with a slot for its own PocketStation style Memory Card module, the VMU. But alas, they bungled it up with [[TooManyCooksSpoilTheSoup some questionable choices]] that were considered serious steps back. The DC pad had two fewer buttons than the Saturn or any of the other pads at the time; [[https://www.segasaturnshiro.com/2022/06/03/interview-kenji-tosaki-talks-saturn-dreamcast-peripheral-design/ according to Kenji Tosaki]], two of the face buttons were removed at the request of developers to simplify game control, with executives and marketing following suit, reasoning that fighting game enthusiasts would purchase arcade sticks.[[note]]According to Tosaki, the official stick sold quite well.[[/note]] Any game needing more inputs would need to make do with the analog shoulder triggers, which due to the travel length, were less than ideal for any game not making full use of them. The d-pad was a cross pad that protruded from the curvature of the rest of the controller, with edges that grated on thumbs from extended use, particularly from fighters, in what was a step back from the Saturn's legendary circular pad. All of this might be somewhat forgivable if the controller was easy to handle, but unfortunately, the DC pad was similar in size and shape to the XBox's gargantuan-sized original "Duke" pad, and worse, the cord, possibly due to the required empty space for the VMU slot, protruded from the rear of the controller rather than the front, reducing available cord length by 6 inches and increasing the likelihood of accidentally yanking the console. [[DidntThinkThisThrough Whoops.]] It's telling that while the Saturn was considered a comparative misstep, and even after being VindicatedByHistory has had less of its legacy revisited due to the less port-friendly nature of the system as well as [[NoExportForYou the low localization ratio of games]] making re-releases few and far between, the Saturn pad, now seen as the ultimate expression of 2d game control, has been officially recreated ever since the [=PS2=] era, with the Dreamcast pad as something of an evolutionary dead end.

to:

*** One of the most commonly praised aspects of the Sega Saturn was its controller. The Sega Dreamcast pad, in what was certainly an additional sore spot for anyone scorned by the Saturn's early cancellation, was a disappointment. The overall idea wasn't a bad one: an evolution of the Saturn 3d pad with a slot for its own PocketStation style Memory Card module, the VMU. But alas, they bungled it up with [[TooManyCooksSpoilTheSoup some questionable choices]] that were considered serious steps back. The DC pad had two fewer buttons than the Saturn or any of the other pads at the time; [[https://www.segasaturnshiro.com/2022/06/03/interview-kenji-tosaki-talks-saturn-dreamcast-peripheral-design/ according to Kenji Tosaki]], two of the face buttons were removed at the request of developers to simplify game control, with executives and marketing following suit, reasoning that fighting game enthusiasts would purchase arcade sticks.[[note]]According to Tosaki, the official stick sold quite well.[[/note]] Any game needing more inputs would need to make do with the analog shoulder triggers, which due to the travel length, were less than ideal for any game not making full use of them. The d-pad was a cross pad that protruded from the curvature of the rest of the controller, with edges that grated on thumbs from extended use, particularly from fighters, in what was a step back from the Saturn's legendary circular pad. All of this might be somewhat forgivable if the controller was easy to handle, but unfortunately, the DC pad was similar in size and shape to the Xbox's gargantuan-sized original "Duke" pad, and worse, the cord, possibly due to the required empty space for the VMU slot, protruded from the rear of the controller rather than the front, reducing available cord length by 6 inches and increasing the likelihood of accidentally yanking the console. [[DidntThinkThisThrough Whoops.]] It's telling that while the Saturn was considered a comparative misstep, and even after being VindicatedByHistory has had less of its legacy revisited due to the less port-friendly nature of the system as well as [[NoExportForYou the low localization ratio of games]] making re-releases few and far between, the Saturn pad, now seen as the ultimate expression of 2d game control, has been officially recreated ever since the [=PS2=] era, with the Dreamcast pad as something of an evolutionary dead end.
Is there an issue? Send a MessageReason:
None


*** Finally, there was the inclusion of the Motorola 68000 CPU. It was intended to manage the functions of the "Tom" and "Jerry" chips, but since it just so happened to be the exact same chip used in the UsefulNotes/SegaGenesis, developers were more familiar with it as opposed to the poorly documented "Tom" and "Jerry" chips, and chose to use the 68000 as the system's main CPU instead of bothering to figure out the "Tom" and "Jerry" chips. The end result of all of this was a very difficult and cumbersome system to program for that was technically underwhelming, "64-bit"[[note]]We say 64-bit in quotes because it's debatable if the Jaguar was even a 64-bit console in the first place. The machine had a 64-bit object processor and blitter, but the two "Tom" and "Jerry" processors were 32-bit, meaning that all calculations were 32-bit, so most of the claims of being 64-bit were from Atari thinking two 32-bit processors "mathed up" to being a 64-bit system. [[UsefulNotes/HowVideoGameSpecsWork Not that it would have a meaningful impact on the system's graphical fidelity anyway.]][[/note]] capabilities be damned.
*** The controller only included three main action buttons, a configuration which was already causing issues for the UsefulNotes/SegaGenesis at the time. In a baffling move, the controller also featured a numeric keypad, something that Atari had last done on the 5200. On that occasion the keypad was pretty superfluous and generally ignored by developers, but it was only taking up what would probably have been unused space on the controller, so it didn't do any harm by being there. The Jaguar's keypad, on the other hand, was far bigger, turning the controller into an ungodly monstrosity that has often been ranked as the absolute worst videogame controller of all-time.[[note]]Its main competition coming ironically from the 5200 controller, though that one's more often given a pass on the grounds that it would have been decent if Atari hadn't cut so many corners, and that by 1993 the company definitely should have known better.[[/note]] Atari later saw sense and produced a revised controller that added in three more command buttons and shoulder buttons, but for compatibility reasons they couldn't ditch the keypad, meaning that the newer version was similarly uncomfortable[[note]]The 5 new buttons were actually just remaps of 5 of the keypad buttons[[/note]]. Note that the Jaguar's controller was in fact designed originally for the Atari Panther, their unreleased 32-bit console that was scheduled to come out in 1991 before it became obvious that the Genesis' 3-button configuration wasn't very future-proof. They evidently figured that the keypad gave them more than enough buttons and didn't bother creating a new controller for the Jaguar, a decision that would prove costly.

to:

*** Finally, there was the inclusion of the Motorola 68000 CPU. It was intended to manage the functions of the "Tom" and "Jerry" chips, but since it just so happened to be the exact same chip used in the UsefulNotes/SegaGenesis, developers were more familiar with it as opposed to the poorly documented "Tom" and "Jerry" chips, and chose to use the 68000 as the system's main CPU instead of bothering to figure out how to balance the functions of the "Tom" and "Jerry" chips. The end result of all of this was a very difficult and cumbersome system to program for that was technically underwhelming, "64-bit"[[note]]We say 64-bit in quotes because it's debatable if the Jaguar was even a 64-bit console in the first place. The machine had a 64-bit object processor and blitter, but the two "Tom" and "Jerry" processors were 32-bit, meaning that all calculations were 32-bit, so most of the claims of being 64-bit were from Atari thinking two 32-bit processors "mathed up" to being a 64-bit system. [[UsefulNotes/HowVideoGameSpecsWork Not that it would have a meaningful impact on the system's graphical fidelity anyway.]][[/note]] capabilities be damned.
*** The controller only included three main action buttons, a configuration which was already causing issues for the UsefulNotes/SegaGenesis at the time. In a baffling move, the controller also featured a numeric keypad, something that Atari had last done on the 5200. On that occasion the keypad was pretty superfluous and generally ignored by developers, but it was only taking up what would probably have been unused space on the controller, so it didn't do any harm by being there. The Jaguar's keypad, on the other hand, was far bigger, turning the controller into an ungodly monstrosity that has often been ranked as the absolute worst videogame controller of all-time.[[note]]Its main competition coming ironically from the 5200 controller, though that one's more often given a pass on the grounds that it would have been decent if Atari hadn't cut so many corners, and that by 1993 the company definitely should have known better.[[/note]] Atari later saw sense and produced a revised controller that added in three more command buttons and shoulder buttons, but for compatibility reasons they couldn't ditch the keypad - in fact, the five new buttons were just remaps of five of the keypad buttons - meaning that the newer version was similarly uncomfortable. Note that the Jaguar's controller was in fact designed originally for the Atari Panther, their unreleased 32-bit console that was scheduled to come out in 1991 before it became obvious that the Genesis' 3-button configuration wasn't very future-proof. They evidently figured that the keypad gave them more than enough buttons and didn't bother creating a new controller for the Jaguar, a decision that would prove costly.
Is there an issue? Send a MessageReason:
None


** Later in the system's life, it became abundantly clear that Philips did ''not'' design the CD-i with video games in mind due to lacking several game-specific hardware features like sprite-scaling and sprite-rotation, things that the Super Nintendo and Sega Genesis had to an extent (and were fully capable of with an additional graphics co-processor, and the latter system could push very primative, untextured polygons without any help). The CD-i was more or less designed for basic interactive software, such as point-and-click edutainment games, so when Philips tried to shift the focus of the system to full-fledged video games such as the ill-fated ''VideoGame/HotelMario'' and the ''[[VideoGame/TheLegendOfZeldaCDiGames Zelda]]'' [[VideoGame/TheLegendOfZeldaCDiGames CD-i trilogy]], these games already looked dated and primative compared to games such as ''VideoGame/DonkeyKongCountry'' and ''VideoGame/StarFox''. Any possible opportunities to rectify this via hardware revisions were stonewalled by Philips' CD-i Green Book standard (which was both a data specification ''and'' a hardware specification), who wouldn't budge on system specs, prioritizing wide compatibility over keeping their hardware up to date. This doomed the CD-i as a video game platform, especially as it was being supplanted by more powerful machines such as the [[UsefulNotes/ThreeDOInteractiveMultiplayer 3DO]] (which could do everything the CD-i could do except better), the UsefulNotes/SegaSaturn, the UsefulNotes/Nintendo64, and most damningly, the Creator/{{Sony}} UsefulNotes/PlayStation. Video games aside, this also ensured that the CD-i could not keep up with the [[TechnologyMarchesOn rapidly evolving technology of the era]], leaving it in the dust of more capable media appliances such as the DVD player.

to:

** Later in the system's life, it became abundantly clear that Philips did ''not'' design the CD-i with video games in mind due to lacking several game-specific hardware features like sprite-scaling and sprite-rotation, things that the Super Nintendo and Sega Genesis had to an extent (and were fully capable of with an additional graphics co-processor, and the latter system could push very primitive, untextured polygons without any help). The CD-i was more or less designed for basic interactive software, such as point-and-click edutainment games, so when Philips tried to shift the focus of the system to full-fledged video games such as the ill-fated ''VideoGame/HotelMario'' and the ''[[VideoGame/TheLegendOfZeldaCDiGames Zelda]]'' CD-i trilogy, these games already looked dated and primitive compared to games such as ''VideoGame/DonkeyKongCountry'' and ''VideoGame/StarFox''. Any possible opportunities to rectify this via hardware revisions were stonewalled by Philips' CD-i Green Book standard (which was both a data specification ''and'' a hardware specification), who wouldn't budge on system specs, prioritizing wide compatibility over keeping their hardware up to date. This doomed the CD-i as a video game platform, especially as it was being supplanted by more powerful machines such as the [[UsefulNotes/ThreeDOInteractiveMultiplayer 3DO]] (which could do everything the CD-i could do except better), the UsefulNotes/SegaSaturn, the UsefulNotes/Nintendo64, and most damningly, the Creator/{{Sony}} UsefulNotes/PlayStation. Video games aside, this also ensured that the CD-i could not keep up with the [[TechnologyMarchesOn rapidly evolving technology of the era]], leaving it in the dust of more capable media appliances such as the DVD player.
