History DarthWiki / IdiotDesign


Changed: 135

Reason:
None


* Depending on what perspective you take on this, modern desktop hardware is approaching this in terms of the performance you get for the power they consume. For instance, [=CPUs=] from both Intel and AMD now hit the thermal limit first, allowing the CPU to draw much more power than previous designs ever did without deliberate mucking around in the firmware settings. Granted, for most scenarios power draw safety is mostly about averages rather than what's being pulled at that instant. However, it seems kind of silly that for most of these processors, about the last 5-10% performance they get requires at least another 20% more power, if that. A cursory search on the internet for articles on power limiting parts tends to show that people can reduce the power consumption on the part by about 20-30% but only lose 5-10%. Essentially: desktop power parts are being designed like high-end sports cars.

to:

* Depending on what perspective you take on this, modern desktop hardware is approaching this in terms of the performance you get for the power it consumes. For instance, [=CPUs=] from both Intel and AMD now hit the thermal limit first, allowing the CPU to draw much more power than previous designs ever did without deliberate mucking around in the firmware settings. Granted, for most scenarios power draw safety is mostly about averages rather than what's being pulled at that instant. However, it seems kind of silly that for most of these processors, about the last 5-10% of performance they get requires at least another 20% more power, if that. A cursory search on the internet for articles on power-limiting parts tends to show that people can reduce the power consumption of the part by about 20-30% but only lose 5-10% performance, which tends to be noticeable only if it's being measured (i.e., you're not going to notice a performance drop in practice). Essentially: desktop parts are being designed like high-end sports cars.
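For the curious, here's a quick back-of-the-envelope sketch of the efficiency math being described above. Only the rough percentages come from the entry; the wattage and the exact 25%/7% split are made-up round numbers for illustration.

```python
# Rough perf-per-watt comparison for a power-limited desktop CPU.
# Illustrative assumptions: limiting power by ~25% costs ~7% performance,
# in line with the ballpark figures quoted above.

stock_power_w = 250.0      # hypothetical stock package power draw
stock_perf = 100.0         # stock performance, normalised to 100

limited_power_w = stock_power_w * (1 - 0.25)   # ~25% lower power limit
limited_perf = stock_perf * (1 - 0.07)         # ~7% lower performance

print(f"stock:         {stock_perf / stock_power_w:.3f} perf per watt")
print(f"power-limited: {limited_perf / limited_power_w:.3f} perf per watt")
# The power-limited setup comes out roughly 24% more efficient, which is
# exactly the sports-car trade-off the entry is complaining about.
```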

Added: 907

Changed: 154

Reason:
None


* The initial batch of the RX 480 graphics cards had a total board power target of 150W, 75 of which could be sourced from the slot itself which is part of the PCI-Express spec. The problem came about when some of the cards began drawing more power than the spec allowed due to board manufacturers factory overclocking the things. This caused in some cases, cards to burn the slot out in motherboards that weren't overbuilt. A driver fix came out later to limit how much power the card uses to avoid drawing too much power from the slot.

to:

* The initial batch of the RX 480 graphics cards had a total board power target of 150W, 75W of which could be sourced from the slot itself, which is part of the PCI-Express spec.[[note]]To be super technical, the GPU primarily uses 12V power, of which only 66W can be sourced from the slot. The rest is from the 3.3V rail.[[/note]] The problem came about when some of the cards began drawing more power than the spec allowed due to board manufacturers factory overclocking the things. This caused, in some cases, cards to burn out the slot on motherboards that weren't overbuilt. A driver fix came out later to limit how much power the card uses to avoid drawing too much power from the slot.
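As a rough illustration of the power budget mentioned in the note, here is a minimal sketch assuming the commonly cited PCI-Express x16 slot limits (about 66W at 12V plus about 10W at 3.3V, 75W combined); the example card's draw is hypothetical.

```python
# Check a card's slot draw against the per-rail limits of a PCI-Express
# x16 slot: roughly 66 W at 12 V plus roughly 10 W at 3.3 V (75 W total).

SLOT_12V_LIMIT_W = 66.0   # 5.5 A * 12 V
SLOT_3V3_LIMIT_W = 9.9    # 3.0 A * 3.3 V

def slot_draw_within_spec(draw_12v_w: float, draw_3v3_w: float) -> bool:
    """True only if the card stays inside both per-rail slot limits."""
    return draw_12v_w <= SLOT_12V_LIMIT_W and draw_3v3_w <= SLOT_3V3_LIMIT_W

# A hypothetical card pulling ~80 W of 12 V power through the slot alone:
print(slot_draw_within_spec(draw_12v_w=80.0, draw_3v3_w=5.0))  # False: out of spec
```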


Added DiffLines:

* Depending on what perspective you take on this, modern desktop hardware is approaching this in terms of the performance you get for the power they consume. For instance, [=CPUs=] from both Intel and AMD now hit the thermal limit first, allowing the CPU to draw much more power than previous designs ever did without deliberate mucking around in the firmware settings. Granted, for most scenarios power draw safety is mostly about averages rather than what's being pulled at that instant. However, it seems kind of silly that for most of these processors, about the last 5-10% performance they get requires at least another 20% more power, if that. A cursory search on the internet for articles on power limiting parts tends to show that people can reduce the power consumption on the part by about 20-30% but only lose 5-10%. Essentially: desktop power parts are being designed like high-end sports cars.
Reason:
HP doesn't have a page as they don't produce tropable works.


* {{UsefulNotes/HP}} keyboards and mice come with a somewhat jarring and inexcusable problem that causes the former to glitch (e.g. delay or not register keypresses or releases) and the latter to judder. The company really can't catch a break.

to:

* HP keyboards and mice come with a somewhat jarring and inexcusable problem that causes the former to glitch (e.g. delay or not register keypresses or releases) and the latter to jitter. The company really can't catch a break.
Reason:
None


* [[UsefulNotes/HP HP]] keyboards and mice come with a somewhat jarring and inexcusable problem that causes the former to glitch (e.g. delay or not register keypresses or releases) and the latter to judder. The company really can't catch a break.

to:

* {{UsefulNotes/HP}} keyboards and mice come with a somewhat jarring and inexcusable problem that causes the former to glitch (e.g. delay or not register keypresses or releases) and the latter to judder. The company really can't catch a break.
Reason:
None


* UsefulNotes/HP keyboards and mice come with a somewhat jarring and inexcusable problem that causes the former to glitch (e.g. delay or not register keypresses or releases) and the latter to judder. The company really can't catch a break.

to:

* [[UsefulNotes/HP HP]] keyboards and mice come with a somewhat jarring and inexcusable problem that causes the former to glitch (e.g. delay or not register keypresses or releases) and the latter to judder. The company really can't catch a break.
Reason:
This is somewhat primitive at the moment; feel free to extend this.

Added DiffLines:

* UsefulNotes/HP keyboards and mice come with a somewhat jarring and inexcusable problem that causes the former to glitch (e.g. delay or not register keypresses or releases) and the latter to judder. The company really can't catch a break.
Reason:
None


*** The first Deluxe-class [[Film/{{Transformers}} 2007 movie]] Brawl figure has a partial auto-transforming gimmick that relies in internal translucent motion plastic gears, which tend to shatter and make the gimmick stop working. On top of that, the pegs attaching his arms to his shoulders are a different shape than the holes they're supposed to peg into. Thankfully, the toy was released some time later in new colors, which fixed all of these issues.

to:

*** The first Deluxe-class [[Film/Transformers2007 2007 movie]] Brawl figure has a partial auto-transforming gimmick that relies on internal translucent plastic motion gears, which tend to shatter and make the gimmick stop working. On top of that, the pegs attaching his arms to his shoulders are a different shape than the holes they're supposed to peg into. Thankfully, the toy was released some time later in new colors, which fixed all of these issues.
Reason:
None


* The Lenovo Yoga folding tablet/laptop seems to have problems no matter what model you buy, but the 720-13IKB probably has it the worst with its power button. Being on the side, the power button interfaces with a small piece that is manipulated to press into the motherboard, which sends the signal to turn the laptop on and off. The problem is that [[https://www.youtube.com/watch?v=kb4p6ad-z1U this small piece is made of cheap aluminum, attached to a similarly-cheap plastic mounting piece]], making perhaps the most important button on the laptop ridiculously easy to break just from regular use. Even worse is that Lenovo apparently foresaw this problem and included a backup reset button on the other side of the laptop - except, in a brilliant display of hubris, this reset button uses the ''same'' cheap design and materials as the main power button, making it just as likely to break.

to:

* The Lenovo Yoga folding tablet/laptop seems to have problems no matter what model you buy, but the 720-13IKB probably has it the worst with its power button. Being on the side, the power button interfaces with a small piece that is manipulated to press into the motherboard, which sends the signal to turn the laptop on and off. The problem is that [[https://www.youtube.com/watch?v=kb4p6ad-z1U this small piece is made of cheap aluminum, attached to a similarly-cheap plastic mounting piece]], making perhaps the most important button on the laptop ridiculously easy to break just from regular use. Even worse is that Lenovo apparently foresaw this problem and included a backup reset button on the other side of the laptop - but they apparently couldn't foresee ''why'' this would be a problem, because this reset button uses the ''same'' cheap design and materials as the main power button, making it just as likely to break.

Added: 894

Changed: 6

Reason:
None


* The Lenovo Yoga folding tablet/laptop seems to have problems no matter what model you buy, but the 720-13IKB probably has it the worst with its power button. Being on the side, the power button interfaces with a small piece that is manipulated to press into the motherboard, which sends the signal to turn the laptop on and off. The problem is that [[https://www.youtube.com/watch?v=kb4p6ad-z1U this small piece is made of cheap aluminum, attached to a similarly-cheap plastic mounting piece]], making perhaps the most important button on the laptop ridiculously easy to break just from regular use. Even worse is that Lenovo apparently foresaw this problem and included a backup reset button on the other side of the laptop - except, in a brilliant display of hubris, this reset button uses the ''same'' cheap design and materials as the main power button, making it just as likely to break.



** When AMD's Ryzen [=7800X3D=] processors were literally blowing up, analysis suggested that motherboard manufacturers were, with varying degrees, partly to blame. One manufacturer in particular was found to have an overcurrent protection chip (which would've helped prevent the CPU from blowing up) was set too high for too long, and this was on a board approaching $1000.

to:

** When AMD's Ryzen [=7800X3D=] processors were literally blowing up, analysis suggested that motherboard manufacturers were, to varying degrees, partly to blame. One manufacturer in particular was found to have set its overcurrent protection (which would've helped prevent the CPU from blowing up) too high for too long, and this was on a board approaching $1000.
Reason:
None


* The 80286 introduced "protected mode," the first memory management for x86 processors. The only problem is that it was impossible to switch back to "real mode," which many MS-DOS programs required to run. This is why Bill Gates called the chip "brain-damaged." The chip was still popular, powering the IBM PC AT and many clones, but mainly as just a fast 8088 processor. The 286 was popular for running multiuser, multitasking systems with the UsefulNotes/{{Unix}} variant XENIX, where the need to switch back to real mode was not an issue. The 80386 fixed this issue, being able to switch between modes.

to:

* The 80286 introduced "protected mode," the first memory management for x86 processors. The only problem is that it was impossible to switch back to "real mode," which many MS-DOS programs required to run. This is why Bill Gates called the chip "brain-damaged." The chip was still popular, powering the IBM PC AT and many clones, but mainly as just a fast 8088 processor. The 286 was popular for running multiuser, multitasking systems with the Platform/{{UNIX}} variant XENIX, where the need to switch back to real mode was not an issue. The 80386 fixed this issue, being able to switch between modes.



** While the Platform/Atari5200 wasn't that poorly-designed of a system in general -- at worst, its absurdly huge size and power/RF combo switchbox could be annoying to deal with, but Atari eventually did away with the latter and were working on a smaller revision when [[UsefulNotes/TheGreatVideoGameCrashOf1983 the market crashed]], forcing them to discontinue the system -- its controllers were a different matter entirely. In many ways they were ahead of their time, with analogue movement along with start and pause buttons. Unfortunately, Atari cheaped out and didn't bother providing them with an auto-centring mechanism, along with building them out of such cheap materials that they usually tended to fail after a few months, if not ''weeks''. The poor-quality controllers subsequently played a major part in dooming the system.

to:

** While the Platform/Atari5200 wasn't that poorly-designed of a system in general -- at worst, its absurdly huge size and power/RF combo switchbox could be annoying to deal with, but Atari eventually did away with the latter and were working on a smaller revision when [[MediaNotes/TheGreatVideoGameCrashOf1983 the market crashed]], forcing them to discontinue the system -- its controllers were a different matter entirely. In many ways they were ahead of their time, with analogue movement along with start and pause buttons. Unfortunately, Atari cheaped out and didn't bother providing them with an auto-centring mechanism, along with building them out of such cheap materials that they usually tended to fail after a few months, if not ''weeks''. The poor-quality controllers subsequently played a major part in dooming the system.
Reason:
None


*** If you can't find a 486 computer, there's a build of UsefulNotes/DOSBox floating around ham circles with beefed-up code to slow down the environment even more than is possible by default. [=MTXs=] were very popular for 900MHz work because, aside from the battery issue, they were tough and cheap to get because of all the public agencies and companies that sold them off in bulk.

to:

*** If you can't find a 486 computer, there's a build of Platform/DOSBox floating around ham circles with beefed-up code to slow down the environment even more than is possible by default. [=MTXs=] were very popular for 900MHz work because, aside from the battery issue, they were tough and cheap to get because of all the public agencies and companies that sold them off in bulk.



*** Finally, there was the inclusion of the Motorola 68000 CPU. It was intended to manage the functions of the "Tom" and "Jerry" chips, but since it just so happened to be the exact same chip used in the Platform/SegaGenesis, developers were more familiar with it as opposed to the poorly documented "Tom" and "Jerry" chips, and chose to use the 68000 as the system's main CPU instead of bothering to figure out how to balance the functions of the "Tom" and "Jerry" chips. The end result of all of this was a very difficult and cumbersome system to program for that was technically underwhelming, "64-bit"[[note]]We say 64-bit in quotes because it's debatable if the Jaguar was even a 64-bit console in the first place. The machine had a 64-bit object processor and blitter, but the two "Tom" and "Jerry" processors were 32-bit, meaning that all calculations were 32-bit, so most of the claims of being 64-bit were from Atari thinking two 32-bit processors "mathed up" to being a 64-bit system. [[UsefulNotes/HowVideoGameSpecsWork Not that it would have a meaningful impact on the system's graphical fidelity anyway.]][[/note]] capabilities be damned.

to:

*** Finally, there was the inclusion of the Motorola 68000 CPU. It was intended to manage the functions of the "Tom" and "Jerry" chips, but since it just so happened to be the exact same chip used in the Platform/SegaGenesis, developers were more familiar with it as opposed to the poorly documented "Tom" and "Jerry" chips, and chose to use the 68000 as the system's main CPU instead of bothering to figure out how to balance the functions of the "Tom" and "Jerry" chips. The end result of all of this was a very difficult and cumbersome system to program for that was technically underwhelming, "64-bit"[[note]]We say 64-bit in quotes because it's debatable if the Jaguar was even a 64-bit console in the first place. The machine had a 64-bit object processor and blitter, but the two "Tom" and "Jerry" processors were 32-bit, meaning that all calculations were 32-bit, so most of the claims of being 64-bit were from Atari thinking two 32-bit processors "mathed up" to being a 64-bit system. [[MediaNotes/HowVideoGameSpecsWork Not that it would have a meaningful impact on the system's graphical fidelity anyway.]][[/note]] capabilities be damned.
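To spell out the point the footnote is making about bit width, here's a tiny illustrative sketch (nothing Jaguar-specific is modelled): a 32-bit processor wraps its arithmetic at 2^32 no matter how many 32-bit processors sit on the board, while a genuinely 64-bit datapath doesn't.

```python
# Bit width doesn't add up across chips: a 32-bit ALU wraps at 2**32
# regardless of how many 32-bit processors a system has.

MASK32 = 0xFFFFFFFF
MASK64 = 0xFFFFFFFFFFFFFFFF

def add32(a: int, b: int) -> int:
    """Addition as a 32-bit processor performs it (wraps modulo 2**32)."""
    return (a + b) & MASK32

def add64(a: int, b: int) -> int:
    """Addition as a 64-bit processor performs it (wraps modulo 2**64)."""
    return (a + b) & MASK64

x = 3_000_000_000
print(add32(x, x))   # 1705032704 -- wrapped around, wrong answer
print(add64(x, x))   # 6000000000 -- correct, needs a real 64-bit datapath
```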



* The Platform/VirtualBoy was a poorly-designed console in general, but perhaps the strangest design flaw was the complete absence of a head strap, which apparently ''was'' meant to have one, but was dropped. While this was ostensibly because of fears that the weight of the device could cause neck strain for younger players, for one thing pre-teens weren't officially supposed to be playing the device anyway, and for another thing the solution they came up with was a fixed 18-inch-tall stand that attached to the underside of the system. This meant that if you didn't have a table and chair that were the ''exact'' right height, you'd likely end up straining your neck and/or back anyway, in addition to the eye strain that the system was notorious for. Even the UsefulNotes/RZone, a notoriously poor ShoddyKnockOffProduct of the system, managed to make room for a head strap in the design.

to:

* The Platform/VirtualBoy was a poorly-designed console in general, but perhaps the strangest design flaw was the complete absence of a head strap; the system apparently ''was'' meant to have one, but it was dropped. While this was ostensibly because of fears that the weight of the device could cause neck strain for younger players, for one thing pre-teens weren't officially supposed to be playing the device anyway, and for another thing the solution they came up with was a fixed 18-inch-tall stand that attached to the underside of the system. This meant that if you didn't have a table and chair that were the ''exact'' right height, you'd likely end up straining your neck and/or back anyway, in addition to the eye strain that the system was notorious for. Even the Platform/RZone, a notoriously poor ShoddyKnockOffProduct of the system, managed to make room for a head strap in the design.



* The Kickstarter-funded UsefulNotes/{{Ouya}} console has gone down in history as having a huge raft of bad ideas:

to:

* The Kickstarter-funded Platform/{{Ouya}} console has gone down in history as having a huge raft of bad ideas:
Reason:
None


* [=MacBook=] disc drives are often finicky to use, sometimes not reading the disc at all and getting it stuck in the drive. The presented solutions? Restarting your computer and holding down the mouse button until it ejects. And even ''that'' isn't guaranteed - sometimes the disc will jut out just enough that [[NoSell the solution won't register at all]] and pushing it in with a pair of tweezers finishes the job. To put this in perspective, technologically inferior ''video game consoles'' like the UsefulNotes/{{Wii}} and UsefulNotes/PlayStation3 can do a slot-loading disc drive far better than Apple apparently can.

to:

* [=MacBook=] disc drives are often finicky to use, sometimes not reading the disc at all and getting it stuck in the drive. The presented solutions? Restarting your computer and holding down the mouse button until it ejects. And even ''that'' isn't guaranteed - sometimes the disc will jut out just enough that [[NoSell the solution won't register at all]] and pushing it in with a pair of tweezers finishes the job. To put this in perspective, technologically inferior ''video game consoles'' like the Platform/{{Wii}} and Platform/PlayStation3 can do a slot-loading disc drive far better than Apple apparently can.



** [[https://en.wikipedia.org/wiki/RDRAM RDRAM]] was touted by Intel and Rambus as a high-performance RAM for the Pentium III to be used in conjunction with the 820. But implementation-wise, it was not up to snuff (in fact, benchmarks revealed that applications ran slower with RDRAM than with the older SDRAM!), and very expensive; third-party chipset makers (such as [=SiS=], who gained some fame during this era) went to cheaper DDR RAM instead (and begrudgingly, so did Intel, leaving Rambus with egg on its face), which ultimately became the de facto industry standard. RDRAM still found use in other applications, though, like the UsefulNotes/Nintendo64 and UsefulNotes/PlayStation2... where it turned out to be one of the biggest performance bottlenecks on both systems -- the [=N64=] had twice the memory (and twice the bandwidth on it) of the UsefulNotes/{{PlayStation}}, but such high latency on it, combined with a ridiculously small buffer for textures loaded into memory to be applied, that it negated those advantages entirely, while the [=PS2=]'s memory, though having twice the clock speed of the UsefulNotes/{{Xbox}}'s older SDRAM, could only afford half as much memory and half the bandwidth on it, contributing to it having [[LoadsAndLoadsOfLoading the longest load times]] of its generation.

to:

** [[https://en.wikipedia.org/wiki/RDRAM RDRAM]] was touted by Intel and Rambus as a high-performance RAM for the Pentium III to be used in conjunction with the 820 chipset. But implementation-wise, it was not up to snuff (in fact, benchmarks revealed that applications ran slower with RDRAM than with the older SDRAM!), and very expensive; third-party chipset makers (such as [=SiS=], who gained some fame during this era) went to cheaper DDR RAM instead (and begrudgingly, so did Intel, leaving Rambus with egg on its face), which ultimately became the de facto industry standard. RDRAM still found use in other applications, though, like the Platform/Nintendo64 and Platform/PlayStation2... where it turned out to be one of the biggest performance bottlenecks on both systems -- the [=N64=] had twice the memory (and twice the bandwidth on it) of the Platform/{{PlayStation}}, but such high latency on it, combined with a ridiculously small buffer for textures loaded into memory to be applied, that it negated those advantages entirely, while the [=PS2=]'s memory, though having twice the clock speed of the Platform/{{Xbox}}'s older SDRAM, could only afford half as much memory and half the bandwidth on it, contributing to it having [[LoadsAndLoadsOfLoading the longest load times]] of its generation.



* The Coleco Adam, a 1983 computer based on the fairly successful UsefulNotes/ColecoVision console, suffered from a host of problems and baffling design decisions. Among the faults were the use of a proprietary tape drive which was prone to failure, locating the whole system's power supply in the ''printer'' (meaning the very limited daisy-wheel printer couldn't be replaced, and if it broke down or was absent, the whole computer was rendered unusable), and poor electromagnetic shielding which could lead to tapes and disks being ''erased'' at startup. Even after revised models ironed out the worst bugs, the system was discontinued after less than 2 years and sales of 100,000 units.

to:

* The Coleco Adam, a 1983 computer based on the fairly successful Platform/ColecoVision console, suffered from a host of problems and baffling design decisions. Among the faults were the use of a proprietary tape drive which was prone to failure, locating the whole system's power supply in the ''printer'' (meaning the very limited daisy-wheel printer couldn't be replaced, and if it broke down or was absent, the whole computer was rendered unusable), and poor electromagnetic shielding which could lead to tapes and disks being ''erased'' at startup. Even after revised models ironed out the worst bugs, the system was discontinued after less than 2 years and sales of 100,000 units.



** Likewise, the UsefulNotes/Atari7800 was an overall reasonably well-designed system, whose primary flaw was that it wasn't suited to the direction that the NES was pushing console game development in[[note]](the 7800 was designed to push huge amounts of sprites on static backgrounds, whereas the NES and later Master System were designed around smaller numbers of sprites on more complex, scrolling backgrounds)[[/note]]. One design decision did stand out as particularly baffling, however, namely the designers apparently deciding that they could save a few bucks by not bothering with a dedicated audio chip, and instead using the primitive sound hardware in the Atari 2600 graphics processor that was included for back-compatibility purposes. By the time the 7800 was widely released, however, console games were widely expected to have in-game music, meaning that 7800 games were either noticeably lacking in this regard or, worse still, tried to use the 2600 audio hardware to render both music and sound effects, usually with horribly shrill and beepy results, such as in [[https://www.youtube.com/watch?v=R4RPD4jnmOQ the system's port of Donkey Kong]]. Yes, the 7800's cartridge port had support for audio expansion via an optional POKEY sound chip on the cartridge, but since that meant the cost to give the 7800 proper sound capabilities suddenly fell on the developer, only two games ever included a POKEY chip, with all other games not bothering to include one due to increased manufacturing costs.
** The UsefulNotes/AtariJaguar suffered from this in spades:
*** The main hardware seemed to be designed with little foresight or consideration to developers. The "Tom" GPU could do texture-mapping, but it couldn't do it well (even the official documentation admits that texture-mapping slows the system to a crawl) thanks to VRAM access not being fast enough. This is the reason behind the flat, untextured look of many [=3D=] Jaguar games and meant that the system could not stand up graphically to even the [[UsefulNotes/ThreeDOInteractiveMultiplayer 3DO]] which came out around the same time, let alone the UsefulNotes/Nintendo64, Creator/{{Sony}} UsefulNotes/PlayStation, or even the UsefulNotes/SegaSaturn, [[TechnologyMarchesOn dooming the system to obsolescence right out of the gate.]] Yes, texture-mapping in video games was a fairly new thing in 1993 with the release of games like ''VideoGame/RidgeRacer'', but the 3DO came out a month prior to the Jaguar with full texture-mapping capabilities, so one would think someone at Atari would catch on and realize that texture-mapped graphics were the future.

to:

** Likewise, the Platform/Atari7800 was an overall reasonably well-designed system, whose primary flaw was that it wasn't suited to the direction that the NES was pushing console game development in[[note]](the 7800 was designed to push huge amounts of sprites on static backgrounds, whereas the NES and later Master System were designed around smaller numbers of sprites on more complex, scrolling backgrounds)[[/note]]. One design decision did stand out as particularly baffling, however, namely the designers apparently deciding that they could save a few bucks by not bothering with a dedicated audio chip, and instead using the primitive sound hardware in the Atari 2600 graphics processor that was included for back-compatibility purposes. By the time the 7800 was widely released, however, console games were widely expected to have in-game music, meaning that 7800 games were either noticeably lacking in this regard or, worse still, tried to use the 2600 audio hardware to render both music and sound effects, usually with horribly shrill and beepy results, such as in [[https://www.youtube.com/watch?v=R4RPD4jnmOQ the system's port of Donkey Kong]]. Yes, the 7800's cartridge port had support for audio expansion via an optional POKEY sound chip on the cartridge, but since that meant the cost to give the 7800 proper sound capabilities suddenly fell on the developer, only two games ever included a POKEY chip, with all other games not bothering to include one due to increased manufacturing costs.
** The Platform/AtariJaguar suffered from this in spades:
*** The main hardware seemed to be designed with little foresight or consideration to developers. The "Tom" GPU could do texture-mapping, but it couldn't do it well (even the official documentation admits that texture-mapping slows the system to a crawl) thanks to VRAM access not being fast enough. This is the reason behind the flat, untextured look of many [=3D=] Jaguar games and meant that the system could not stand up graphically to even the [[Platform/ThreeDOInteractiveMultiplayer 3DO]] which came out around the same time, let alone the Platform/Nintendo64, Creator/{{Sony}} Platform/PlayStation, or even the Platform/SegaSaturn, [[TechnologyMarchesOn dooming the system to obsolescence right out of the gate.]] Yes, texture-mapping in video games was a fairly new thing in 1993 with the release of games like ''VideoGame/RidgeRacer'', but the 3DO came out a month prior to the Jaguar with full texture-mapping capabilities, so one would think someone at Atari would catch on and realize that texture-mapped graphics were the future.



*** Finally, there was the inclusion of the Motorola 68000 CPU. It was intended to manage the functions of the "Tom" and "Jerry" chips, but since it just so happened to be the exact same chip used in the UsefulNotes/SegaGenesis, developers were more familiar with it as opposed to the poorly documented "Tom" and "Jerry" chips, and chose to use the 68000 as the system's main CPU instead of bothering to figure out how to balance the functions of the "Tom" and "Jerry" chips. The end result of all of this was a very difficult and cumbersome system to program for that was technically underwhelming, "64-bit"[[note]]We say 64-bit in quotes because it's debatable if the Jaguar was even a 64-bit console in the first place. The machine had a 64-bit object processor and blitter, but the two "Tom" and "Jerry" processors were 32-bit, meaning that all calculations were 32-bit, so most of the claims of being 64-bit were from Atari thinking two 32-bit processors "mathed up" to being a 64-bit system. [[UsefulNotes/HowVideoGameSpecsWork Not that it would have a meaningful impact on the system's graphical fidelity anyway.]][[/note]] capabilities be damned.
*** The controller only included three main action buttons, a configuration which was already causing issues for the UsefulNotes/SegaGenesis at the time. In a baffling move, the controller also featured a numeric keypad, something that Atari had last done on the 5200. On that occasion the keypad was pretty superfluous and generally ignored by developers, but it was only taking up what would probably have been unused space on the controller, so it didn't do any harm by being there. The Jaguar's keypad, on the other hand, was far bigger, turning the controller into an ungodly monstrosity that has often been ranked as the absolute worst videogame controller of all-time.[[note]]Its main competition coming ironically from the 5200 controller, though that one's more often given a pass on the grounds that it would have been decent if Atari hadn't cut so many corners, and that by 1993 the company definitely should have known better.[[/note]] Atari later saw sense and produced a revised controller that added in three more command buttons and shoulder buttons, but for compatibility reasons they couldn't ditch the keypad - in fact, the five new buttons were just remaps of five of the keypad buttons - meaning that the newer version was similarly uncomfortable. The Jaguar's controller was in fact designed originally for the Atari Panther, their unreleased 32-bit console that was scheduled to come out in 1991 before it became obvious that the Genesis' 3-button configuration wasn't very future-proof. They evidently figured that the keypad gave them more than enough buttons and didn't bother creating a new controller for the Jaguar, a decision that would prove costly.

to:

*** Finally, there was the inclusion of the Motorola 68000 CPU. It was intended to manage the functions of the "Tom" and "Jerry" chips, but since it just so happened to be the exact same chip used in the Platform/SegaGenesis, developers were more familiar with it as opposed to the poorly documented "Tom" and "Jerry" chips, and chose to use the 68000 as the system's main CPU instead of bothering to figure out how to balance the functions of the "Tom" and "Jerry" chips. The end result of all of this was a very difficult and cumbersome system to program for that was technically underwhelming, "64-bit"[[note]]We say 64-bit in quotes because it's debatable if the Jaguar was even a 64-bit console in the first place. The machine had a 64-bit object processor and blitter, but the two "Tom" and "Jerry" processors were 32-bit, meaning that all calculations were 32-bit, so most of the claims of being 64-bit were from Atari thinking two 32-bit processors "mathed up" to being a 64-bit system. [[UsefulNotes/HowVideoGameSpecsWork Not that it would have a meaningful impact on the system's graphical fidelity anyway.]][[/note]] capabilities be damned.
*** The controller only included three main action buttons, a configuration which was already causing issues for the Platform/SegaGenesis at the time. In a baffling move, the controller also featured a numeric keypad, something that Atari had last done on the 5200. On that occasion the keypad was pretty superfluous and generally ignored by developers, but it was only taking up what would probably have been unused space on the controller, so it didn't do any harm by being there. The Jaguar's keypad, on the other hand, was far bigger, turning the controller into an ungodly monstrosity that has often been ranked as the absolute worst videogame controller of all-time.[[note]]Its main competition coming ironically from the 5200 controller, though that one's more often given a pass on the grounds that it would have been decent if Atari hadn't cut so many corners, and that by 1993 the company definitely should have known better.[[/note]] Atari later saw sense and produced a revised controller that added in three more command buttons and shoulder buttons, but for compatibility reasons they couldn't ditch the keypad - in fact, the five new buttons were just remaps of five of the keypad buttons - meaning that the newer version was similarly uncomfortable. The Jaguar's controller was in fact designed originally for the Atari Panther, their unreleased 32-bit console that was scheduled to come out in 1991 before it became obvious that the Genesis' 3-button configuration wasn't very future-proof. They evidently figured that the keypad gave them more than enough buttons and didn't bother creating a new controller for the Jaguar, a decision that would prove costly.



** Most revisions of the original UsefulNotes/{{Xbox}} used a very cheap clock capacitor with such a high failure rate that it's basically guaranteed to break and leak all over the motherboard after a few years of normal use, far shorter than the normal lifetime of this type of component. Making this more annoying is that the clock capacitor is not an important part: it does not save time information if the system is unplugged for more than 30 minutes and the console works fine without it. The last major revision (1.6) of the system uses a different, better brand and is exempt from this issue.
** The infamous "Red Ring of Death" that occurs in some UsefulNotes/{{Xbox 360}} units. It was a consequence of three factors: the introduction of lead-free solder, which is toxicologically safer but harder to properly solder with; inconsistent quality of the solder itself, which got better in later years but was prone to cracking under stress in early revisions; and bad thermal design, where clearance issues with the DVD drive caused Microsoft to use a dinky little heatsink for chips that were known to run hot. Result: the chips would overheat, the defective and improperly-applied solder would crack from the heat expansion, and the connections would break.

to:

** Most revisions of the original Platform/{{Xbox}} used a very cheap clock capacitor with such a high failure rate that it's basically guaranteed to break and leak all over the motherboard after a few years of normal use, far shorter than the normal lifetime of this type of component. Making this more annoying is that the clock capacitor is not an important part: it does not save time information if the system is unplugged for more than 30 minutes and the console works fine without it. The last major revision (1.6) of the system uses a different, better brand and is exempt from this issue.
** The infamous "Red Ring of Death" that occurs in some Platform/{{Xbox 360}} units. It was a consequence of three factors: the introduction of lead-free solder, which is toxicologically safer but harder to properly solder with; inconsistent quality of the solder itself, which got better in later years but was prone to cracking under stress in early revisions; and bad thermal design, where clearance issues with the DVD drive caused Microsoft to use a dinky little heatsink for chips that were known to run hot. Result: the chips would overheat, the defective and improperly-applied solder would crack from the heat expansion, and the connections would break.



*** Microsoft recommends to not have the original UsefulNotes/XboxOne model in any position other than horizontal because the optical drive isn't designed for any orientation other than that. ''Every'' 360 model was rated to work in vertical orientation, even with the aforementioned scratching problem, and Microsoft quickly restored support for vertical orientation with the updated Xbox One S model.

to:

*** Microsoft recommends against having the original Platform/XboxOne model in any position other than horizontal because the optical drive isn't designed for any orientation other than that. ''Every'' 360 model was rated to work in vertical orientation, even with the aforementioned scratching problem, and Microsoft quickly restored support for vertical orientation with the updated Xbox One S model.



* When Nintendo of America's engineers redesigned the Famicom into the UsefulNotes/NintendoEntertainmentSystem, they removed the pins which allowed for cartridges to include add-on audio chips and rerouted them to the expansion slot on the bottom of the system in order to facilitate the western counterpart to the in-development Famicom Disk System. Unfortunately, not only was said counterpart never released, there was no real reason they couldn't have run audio expansion pins to both the cartridge slot and expansion port, other than the engineers wanting to save a few cents on the necessary IC. This meant that not only could no western NES game ever have any additional audio chips, it also disincentivised Japanese developers from using them, as it would entail reprogramming the entire soundtrack for a western release.\\

to:

* When Nintendo of America's engineers redesigned the Famicom into the Platform/NintendoEntertainmentSystem, they removed the pins which allowed for cartridges to include add-on audio chips and rerouted them to the expansion slot on the bottom of the system in order to facilitate the western counterpart to the in-development Famicom Disk System. Unfortunately, not only was said counterpart never released, there was no real reason they couldn't have run audio expansion pins to both the cartridge slot and expansion port, other than the engineers wanting to save a few cents on the necessary IC. This meant that not only could no western NES game ever have any additional audio chips, it also disincentivised Japanese developers from using them, as it would entail reprogramming the entire soundtrack for a western release.\\



* The UsefulNotes/SuperNintendoEntertainmentSystem was an overall improvement over the NES, doing away with the front-loading cartridge slot altogether and using better-quality pin connectors. That being said, there's still one notable flaw in the system in that the power plug on the back of the system is prone to breaking. Whereas the inner barrel of the power plug on the NES is made of metal, the SNES, on the other hand, uses a plastic barrel instead. This results in a more fragile power port in which the inner barrel breaks off from the stress of repeatedly plugging and unplugging the power cord from the system end. It's not uncommon to see used consoles with broken AC In plugs. Thankfully you can buy replacements, but you will need to know how to solder to replace the piece.
* The UsefulNotes/VirtualBoy was a poorly-designed console in general, but perhaps the strangest design flaw was the complete absence of a head strap, which apparently ''was'' meant to have one, but was dropped. While this was ostensibly because of fears that the weight of the device could cause neck strain for younger players, for one thing pre-teens weren't officially supposed to be playing the device anyway, and for another thing the solution they came up with was a fixed 18-inch-tall stand that attached to the underside of the system. This meant that if you didn't have a table and chair that were the ''exact'' right height, you'd likely end up straining your neck and/or back anyway, in addition to the eye strain that the system was notorious for. Even the UsefulNotes/RZone, a notoriously poor ShoddyKnockOffProduct of the system, managed to make room for a head strap in the design.

to:

* The Platform/SuperNintendoEntertainmentSystem was an overall improvement over the NES, doing away with the front-loading cartridge slot altogether and using better-quality pin connectors. That being said, there's still one notable flaw in the system in that the power plug on the back of the system is prone to breaking. Whereas the inner barrel of the power plug on the NES is made of metal, the SNES uses a plastic barrel instead. This results in a more fragile power port in which the inner barrel breaks off from the stress of repeatedly plugging and unplugging the power cord from the system end. It's not uncommon to see used consoles with broken AC In plugs. Thankfully you can buy replacements, but you will need to know how to solder to replace the piece.
* The Platform/VirtualBoy was a poorly-designed console in general, but perhaps the strangest design flaw was the complete absence of a head strap, which apparently ''was'' meant to have one, but was dropped. While this was ostensibly because of fears that the weight of the device could cause neck strain for younger players, for one thing pre-teens weren't officially supposed to be playing the device anyway, and for another thing the solution they came up with was a fixed 18-inch-tall stand that attached to the underside of the system. This meant that if you didn't have a table and chair that were the ''exact'' right height, you'd likely end up straining your neck and/or back anyway, in addition to the eye strain that the system was notorious for. Even the UsefulNotes/RZone, a notoriously poor ShoddyKnockOffProduct of the system, managed to make room for a head strap in the design.



* The UsefulNotes/{{Wii}} has no crash handler. So if you manage to crash your system, you open it up to Arbitrary Code Execution, and a whole load of security vulnerabilities await you. Do you have an SD card inserted? Well, crash any game that reads and writes to it and even ''more'' vulnerabilities open up. They'll tell you that they fixed these vulnerabilities though system updates, but in reality, they never did. In fact, the only thing these updates did on that matter was simply remove anything that was installed ''with'' these vulnerabilities - nothing's stopping you from using these vulnerabilities again to ''re''-install them. [[GoodBadBugs All of this is a good thing if you like modding your console.]]
* UsefulNotes/WiiU:
** While the console ultimately proved a failure for several reasons, poor component choices helped contribute to its near-total lack of third-party support. It'd only be a slight exaggeration to say that the system's CPU was just three heavily overclocked Wii [=CPUs=] -- the Wii's own CPU in turn being just a higher-clocked version of the CPU that had been used a decade earlier in the UsefulNotes/NintendoGameCube -- slapped together on the same die, with performance that was abysmally poor by 2012 standards. Its GPU, while not ''as'' slow, wasn't all that much faster than those of the [=PS3=] and Xbox 360,[[note]]in fact, on paper the Wii U's GPU was actually ''slower'' than the [=GPUs=] of either of those consoles, but due to advances in technology was about twice as fast in practice... unless you offloaded code to the GPU to offset the lack of CPU grunt, which would bring it back down to, if not below, [=PS3/360=] levels[[/note]] and used a shader model in-between those of the older consoles and their successors, meaning that ported [=PS3=]/360 games didn't take advantage of the newer hardware, while games designed for the [=PS4=] and Xbox One wouldn't even work to begin with due to the lack of necessary feature support. While Nintendo likely stuck with the [=PowerPC=] architecture for UsefulNotes/BackwardsCompatibility reasons, the system would likely have fared much better if Nintendo had just grabbed an off-the-shelf AMD laptop APU - which had enough power even in 2012 to brute-force emulate the Wii, eliminating the main reason to keep with the [=PowerPC=] line - stuffed it into a Wii case and called it a day. Fortunately, Nintendo seems to have learned from this, basing the UsefulNotes/NintendoSwitch on an existing [=nVidia=] mobile chip which thus far has proven surprisingly capable of punching above its weight.

to:

* The Platform/{{Wii}} has no crash handler. So if you manage to crash your system, you open it up to Arbitrary Code Execution, and a whole load of security vulnerabilities await you. Do you have an SD card inserted? Well, crash any game that reads and writes to it and even ''more'' vulnerabilities open up. They'll tell you that they fixed these vulnerabilities through system updates, but in reality, they never did. In fact, the only thing these updates did on that matter was simply remove anything that was installed ''with'' these vulnerabilities - nothing's stopping you from using these vulnerabilities again to ''re''-install them. [[GoodBadBugs All of this is a good thing if you like modding your console.]]
* Platform/WiiU:
** While the console ultimately proved a failure for several reasons, poor component choices helped contribute to its near-total lack of third-party support. It'd only be a slight exaggeration to say that the system's CPU was just three heavily overclocked Wii [=CPUs=] -- the Wii's own CPU in turn being just a higher-clocked version of the CPU that had been used a decade earlier in the Platform/NintendoGameCube -- slapped together on the same die, with performance that was abysmally poor by 2012 standards. Its GPU, while not ''as'' slow, wasn't all that much faster than those of the [=PS3=] and Xbox 360,[[note]]in fact, on paper the Wii U's GPU was actually ''slower'' than the [=GPUs=] of either of those consoles, but due to advances in technology was about twice as fast in practice... unless you offloaded code to the GPU to offset the lack of CPU grunt, which would bring it back down to, if not below, [=PS3/360=] levels[[/note]] and used a shader model in-between those of the older consoles and their successors, meaning that ported [=PS3=]/360 games didn't take advantage of the newer hardware, while games designed for the [=PS4=] and Xbox One wouldn't even work to begin with due to the lack of necessary feature support. While Nintendo likely stuck with the [=PowerPC=] architecture for UsefulNotes/BackwardsCompatibility reasons, the system would likely have fared much better if Nintendo had just grabbed an off-the-shelf AMD laptop APU - which had enough power even in 2012 to brute-force emulate the Wii, eliminating the main reason to keep with the [=PowerPC=] line - stuffed it into a Wii case and called it a day. Fortunately, Nintendo seems to have learned from this, basing the Platform/NintendoSwitch on an existing [=nVidia=] mobile chip which thus far has proven surprisingly capable of punching above its weight.



* UsefulNotes/NintendoSwitch:

to:

* Platform/NintendoSwitch:



** The UsefulNotes/SegaSaturn is, despite its admitted strong points on the player end, seen as one of the worst major consoles internally. It was originally intended to be the best 2D gaming system out there (which it was), so its design was directly based on the 32X with higher clockspeeds, more memory, and CD storage. However, partway through development Sega learned of Sony's and Nintendo's upcoming systems (the [=PlayStation=] and [=Nintendo 64=] respectively) which were both designed with 3D games in mind, and realized the market - especially in their North America stronghold - was about to shift under their feet; they wouldn't have a prayer of competing. So, in an effort to try to bring more and more power to the console, Sega added an extra CPU and GPU to the system, which sounds great at first... until you consider that there were also ''six other processors'' that couldn't interface too well. This also made the motherboard prohibitively complex, being the most expensive console at the time. And lastly, much like the infamous Nvidia [=NV1=] which has its own example in this very page, the GPU worked on four-sided basic primitives while the industry standard was three sides, a significant hurdle for multiplatform games as those developed with triangular primitives would require extensive porting work to adapt them to quads. All this piled-on complication made development on the Saturn a nightmare. Ironically, consoles with multiple CPU cores would become commonplace two generations later with the Xbox 360 and [=PlayStation=] 3; like a lot of Sega's various other products of that era, they had attempted to push new features before game developers were really ready to make use of them.
** The UsefulNotes/SegaDreamcast, for the most part, was a solidly designed machine overall, taking many lessons learned from the Saturn and applying them to the Dreamcast. That didn't mean it didn't had its own share of mistakes:
*** Ironically, one aspect of the Dreamcast that is much worse than the Saturn is the system's security. The Saturn had a robust security system similar to the [[UsefulNotes/PlayStation Sony PlayStation]] that took decades to defeat,[[note]]The simplified explanation is that a "wobble groove" was read off the disc, and if it didn't see it, it didn't boot[[/note]] so when Sega was designing the Dreamcast's copy protection mechanism, they took what they learned from the Saturn and threw it in the garbage in favor of using a proprietary GD-ROM format (this same format was also being used on their arcade hardware at the time) to boot games from as the console's sole security. On paper, this seemed like a good idea, but there was one gaping hole in the Dreamcast's security system: the system can accept ''another'' of Sega's proprietary formats called MIL-CD, which was like an Enhanced CD but with Dreamcast-specific features. The format was a major flop with no [=MIL-CDs=] ever releasing outside of Japan, but pirates quickly figured out that MIL-CD had no hardware-level copy protection. A MIL-CD's boot data was scrambled from the factory, and the Dreamcast contained an "unscrambler" that would descramble the boot data into something readable and boot into the game. A Dreamcast SDK was all pirates needed to defeat this, and running pirated games on the Dreamcast was as easy as burning a cracked ISO onto a CD-R and putting it in the machine. Sega removed MIL-CD support on a late revision of the Dreamcast to combat this, but it was too late, and the Dreamcast would become the most pirated disc-based home console of all time.
*** On a lesser note, the Dreamcast could only read 100KB of save data from a single VMU at a time, with no room for expansion. Compared to the UsefulNotes/PlayStation2's gargantuan 8MB memory card, this was absolutely ''tiny''. They attempted to release a [=4X=] Memory Card, but it had to work around the Dreamcast's design flaw of only reading 100KB from a card at a time by separating the 400KB of space into four "pages". The major downside is that games couldn't be stored over multiple pages as it was four memory cards in one, and some games wouldn't detect it or would outright ''[[GameBreakingBug crash]]'' when trying to read from it. You also couldn't copy game saves between pages without a second memory card, and the 4X Memory Card didn't support any VMU-specific features, as it lacked a screen and face buttons.

to:

** The Platform/SegaSaturn is, despite its admitted strong points on the player end, seen as one of the worst major consoles internally. It was originally intended to be the best 2D gaming system out there (which it was), so its design was directly based on the 32X with higher clockspeeds, more memory, and CD storage. However, partway through development Sega learned of Sony's and Nintendo's upcoming systems (the [=PlayStation=] and [=Nintendo 64=] respectively) which were both designed with 3D games in mind, and realized the market - especially in their North America stronghold - was about to shift under their feet; they wouldn't have a prayer of competing. So, in an effort to try to bring more and more power to the console, Sega added an extra CPU and GPU to the system, which sounds great at first... until you consider that there were also ''six other processors'' that couldn't interface too well. This also made the motherboard prohibitively complex, being the most expensive console at the time. And lastly, much like the infamous Nvidia [=NV1=] which has its own example in this very page, the GPU worked on four-sided basic primitives while the industry standard was three sides, a significant hurdle for multiplatform games as those developed with triangular primitives would require extensive porting work to adapt them to quads. All this piled-on complication made development on the Saturn a nightmare. Ironically, consoles with multiple CPU cores would become commonplace two generations later with the Xbox 360 and [=PlayStation=] 3; like a lot of Sega's various other products of that era, they had attempted to push new features before game developers were really ready to make use of them.
** The Platform/SegaDreamcast, for the most part, was a solidly designed machine overall, taking many lessons learned from the Saturn and applying them to the Dreamcast. That didn't mean it didn't have its own share of mistakes:
*** Ironically, one aspect of the Dreamcast that is much worse than the Saturn is the system's security. The Saturn had a robust security system similar to the [[Platform/PlayStation Sony PlayStation]] that took decades to defeat,[[note]]The simplified explanation is that a "wobble groove" was read off the disc, and if it didn't see it, it didn't boot[[/note]] so when Sega was designing the Dreamcast's copy protection mechanism, they took what they learned from the Saturn and threw it in the garbage in favor of using a proprietary GD-ROM format (this same format was also being used on their arcade hardware at the time) to boot games from as the console's sole security. On paper, this seemed like a good idea, but there was one gaping hole in the Dreamcast's security system: the system can accept ''another'' of Sega's proprietary formats called MIL-CD, which was like an Enhanced CD but with Dreamcast-specific features. The format was a major flop with no [=MIL-CDs=] ever releasing outside of Japan, but pirates quickly figured out that MIL-CD had no hardware-level copy protection. A MIL-CD's boot data was scrambled from the factory, and the Dreamcast contained an "unscrambler" that would descramble the boot data into something readable and boot into the game. A Dreamcast SDK was all pirates needed to defeat this, and running pirated games on the Dreamcast was as easy as burning a cracked ISO onto a CD-R and putting it in the machine. Sega removed MIL-CD support on a late revision of the Dreamcast to combat this, but it was too late, and the Dreamcast would become the most pirated disc-based home console of all time.
*** On a lesser note, the Dreamcast could only read 100KB of save data from a single VMU at a time, with no room for expansion. Compared to the UsefulNotes/PlayStation2's Platform/PlayStation2's gargantuan 8MB memory card, this was absolutely ''tiny''. A [=4X=] Memory Card was later released, but it had to work around the Dreamcast's design flaw of only reading 100KB from a card at a time by separating the 400KB of space into four "pages". The major downside was that games couldn't be stored over multiple pages, as it was effectively four memory cards in one, and some games wouldn't detect it or would outright ''[[GameBreakingBug crash]]'' when trying to read from it. You also couldn't copy game saves between pages without a second memory card, and the 4X Memory Card didn't support any VMU-specific features, as it lacked a screen and face buttons.
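For anyone wondering what the Saturn's quad-versus-triangle problem mentioned above looks like in practice, here is a minimal, hypothetical sketch of the usual workaround: submit each triangle as a degenerate quad by repeating one vertex. The structs and ''draw_quad()'' below are made-up stand-ins rather than real Saturn SDK calls; the point is only to show the extra vertex traffic and awkward mapping a port had to absorb.

```c
/* Hypothetical sketch only: the types and draw_quad() are stand-ins,
   not actual Saturn SDK functions. */
#include <stdio.h>
#include <stddef.h>

typedef struct { float x, y, z; } Vertex;
typedef struct { Vertex v[4]; } Quad;

/* Stand-in for a quad-only hardware submission call. */
static void draw_quad(const Quad *q)
{
    printf("quad: (%.1f,%.1f) (%.1f,%.1f) (%.1f,%.1f) (%.1f,%.1f)\n",
           q->v[0].x, q->v[0].y, q->v[1].x, q->v[1].y,
           q->v[2].x, q->v[2].y, q->v[3].x, q->v[3].y);
}

/* Port a triangle list by repeating the last vertex of each triangle,
   producing a degenerate quad: wasted vertex traffic and skewed texture
   mapping are roughly the porting pain described above. */
static void draw_triangles_as_quads(const Vertex *verts, size_t tri_count)
{
    for (size_t i = 0; i < tri_count; ++i) {
        Quad q;
        q.v[0] = verts[3 * i + 0];
        q.v[1] = verts[3 * i + 1];
        q.v[2] = verts[3 * i + 2];
        q.v[3] = verts[3 * i + 2];  /* duplicated vertex makes the quad degenerate */
        draw_quad(&q);
    }
}

int main(void)
{
    Vertex tri[3] = { {0, 0, 0}, {1, 0, 0}, {0, 1, 0} };
    draw_triangles_as_quads(tri, 1);
    return 0;
}
```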



** The usage of an optical disc format on the PSP can qualify. On paper, it made perfect sense to choose optical discs over cartridges because of the former's storage size advantages and relatively low manufacturing expense, and it enabled the release of high-quality video on the PSP. After all, utilizing optical discs instead of cartridges had propelled the original UsefulNotes/PlayStation to much greater success than [[UsefulNotes/Nintendo64 its competition from Nintendo]], so there was no reason to assume the same thing wouldn't happen again in the handheld market. However, in practice, optical discs quickly proved to be a poor fit for a handheld system: Sony's Universal Media Disc (UMD) was fragile, as there were many cases of the outer plastic shell which protected the actual data disc cracking and breaking, rendering the disc useless until the user bought an aftermarket replacement shell. In addition, it wasn't uncommon for the PSP's UMD drive to fail due to wear of the moving parts. UMD also failed to catch on as a video format: its proprietary technology made it more expensive to produce than other optical disc formats, and so UMD movies were priced higher than [=DVDs=] despite holding noticeably less data.[[note]]UMD holds at most 1.8 gigabytes on a dual-layer disc; DVD holds at ''least'' 4.7 on a single-layer disc, more than two and a half times that.[[/note]] There were also no devices besides the PSP that could play UMD movies, meaning that you were stuck watching your UMD movies on the PSP's small screen, and couldn't swap them with your friends unless they too had a PSP. This drove away consumers, who would rather purchase a portable DVD player and have access to a cheaper media library, while the more tech-savvy could rip their [=DVDs=] and put them on Memory Sticks to watch on PSP without a separate disc. In addition, [[LoadsAndLoadsOfLoading UMD load times were]] ''[[LoadsAndLoadsOfLoading long]]'', compared to those of a standard UsefulNotes/NintendoDS cartridge, which Sony themselves tried to fix by doubling the memory of the PSP in later models to use as a UMD cache. By 2009, Sony themselves were trying to phase out the UMD with the PSP Go, which did not have a UMD drive and relied on digital downloads from the [=PlayStation=] Store, but it was too late; most games were already released on UMD while very few were actually made available digitally, and so the Go blocked off a major portion of PSP games without offering any real advantage to make up for it, since the [=PlayStation=] Store was available on ''all'' PSP models at the time, which led to consumers ignoring the Go. Sony's decision to use standard cartridges for the PSP's successor, the UsefulNotes/PlayStationVita, seemed like a tacit admission that using UMD on the PSP was a mistake.

to:

** The usage of an optical disc format on the PSP can qualify. On paper, it made perfect sense to choose optical discs over cartridges because of the former's storage size advantages and relatively low manufacturing expense, and it enabled the release of high-quality video on the PSP. After all, utilizing optical discs instead of cartridges had propelled the original UsefulNotes/PlayStation Platform/PlayStation to much greater success than [[UsefulNotes/Nintendo64 [[Platform/Nintendo64 its competition from Nintendo]], so there was no reason to assume the same thing wouldn't happen again in the handheld market. However, in practice, optical discs quickly proved to be a poor fit for a handheld system: Sony's Universal Media Disc (UMD) was fragile, as there were many cases of the outer plastic shell which protected the actual data disc cracking and breaking, rendering the disc useless until the user bought an aftermarket replacement shell. In addition, it wasn't uncommon for the PSP's UMD drive to fail due to wear of the moving parts. UMD also failed to catch on as a video format: its proprietary technology made it more expensive to produce than other optical disc formats, and so UMD movies were priced higher than [=DVDs=] despite holding noticeably less data.[[note]]UMD holds at most 1.8 gigabytes on a dual-layer disc; DVD holds at ''least'' 4.7 on a single-layer disc, more than two and a half times that.[[/note]] There were also no devices besides the PSP that could play UMD movies, meaning that you were stuck watching your UMD movies on the PSP's small screen, and couldn't swap them with your friends unless they too had a PSP. This drove away consumers, who would rather purchase a portable DVD player and have access to a cheaper media library, while the more tech-savvy could rip their [=DVDs=] and put them on Memory Sticks to watch on PSP without a separate disc. In addition, [[LoadsAndLoadsOfLoading UMD load times were]] ''[[LoadsAndLoadsOfLoading long]]'', compared to those of a standard UsefulNotes/NintendoDS Platform/NintendoDS cartridge, which Sony themselves tried to fix by doubling the memory of the PSP in later models to use as a UMD cache. By 2009, Sony themselves were trying to phase out the UMD with the PSP Go, which did not have a UMD drive and relied on digital downloads from the [=PlayStation=] Store, but it was too late; most games were already released on UMD while very few were actually made available digitally, and so the Go blocked off a major portion of PSP games without offering any real advantage to make up for it, since the [=PlayStation=] Store was available on ''all'' PSP models at the time, which led to consumers ignoring the Go. Sony's decision to use standard cartridges for the PSP's successor, the UsefulNotes/PlayStationVita, Platform/PlayStationVita, seemed like a tacit admission that using UMD on the PSP was a mistake.
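To make the "extra RAM as a UMD cache" fix mentioned above a little more concrete, here is a hedged sketch of the general idea: keep recently read sectors in memory so that repeat reads are served from RAM instead of the slow optical drive. The names and the direct-mapped layout are illustrative assumptions, not Sony's actual firmware, which would also have to handle eviction policy and prefetching.

```c
/* Illustrative read-through sector cache; read_sector_from_disc() is a
   made-up stand-in for the slow physical UMD read, not a real PSP API. */
#include <stdio.h>
#include <string.h>

#define SECTOR_SIZE 2048
#define CACHE_SLOTS 256                 /* ~512KB of RAM spent on caching */

typedef struct {
    long sector;                        /* disc sector held in this slot, -1 = empty */
    unsigned char data[SECTOR_SIZE];
} CacheSlot;

static CacheSlot cache[CACHE_SLOTS];

/* Stand-in for the slow optical read. */
static void read_sector_from_disc(long sector, unsigned char *out)
{
    memset(out, (int)(sector & 0xFF), SECTOR_SIZE);   /* fake data */
}

static void cache_init(void)
{
    for (int i = 0; i < CACHE_SLOTS; ++i) cache[i].sector = -1;
}

/* Read-through: check the cache first, fall back to the disc and remember the result. */
static void read_sector(long sector, unsigned char *out)
{
    CacheSlot *slot = &cache[sector % CACHE_SLOTS];   /* direct-mapped for brevity */
    if (slot->sector != sector) {
        read_sector_from_disc(sector, slot->data);    /* slow path: spin the disc */
        slot->sector = sector;
    }
    memcpy(out, slot->data, SECTOR_SIZE);             /* fast path: copy from RAM */
}

int main(void)
{
    unsigned char buf[SECTOR_SIZE];
    cache_init();
    read_sector(16, buf);   /* miss: goes to the "disc" */
    read_sector(16, buf);   /* hit: served from RAM */
    printf("first byte: %u\n", buf[0]);
    return 0;
}
```

The same trick only helps with data that gets re-read, which is why later PSP models still couldn't do much about a game's very first load from UMD.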



** Reliability issues aside, the [=PS3=]'s actual hardware wasn't exactly a winner either, and depending on who you ask, the [=PS3=] was either a well-designed if misunderstood piece of tech or had the worst internal console architecture since the UsefulNotes/SegaSaturn. Ken Kutaragi envisioned the [=PlayStation=] 3 as "a supercomputer for the home", and as a result Sony implemented their Cell Broadband Engine processor, co-developed by IBM and Toshiba for supercomputer applications, into the console. While this in theory made the console much more powerful than the Xbox 360, in practice this made the system exponentially more difficult to program for as the CPU was not designed with video games in mind. In layman's terms, it featured eight individually programmable "cores", one general-purpose, with the others much more specialized and with only limited access to the rest of the system. Contrast that with, say, the Xbox 360's Xenon processor, which used a much more conventional architecture with three general-purpose cores that was much easier to program, and that was exactly the [=PS3=]'s downfall from a hardware standpoint. The Cell processor's unconventional architecture meant that it was notoriously difficult to write efficient code for (for comparison, a program that only consists of a few lines of code could easily consist of hundreds of lines if converted to Cell code), and good luck rewriting code designed for conventional processors into code for the Cell processor, which explains why many multi-platform games ran better on the 360 than on the [=PS3=]: many developers weren't so keen on spending development resources rewriting their game to run properly on the [=PS3=], so what they would do instead was run the game on the general-purpose core and ignore the rest, effectively using only a fraction of the system's power. While developers would later put out some visually stunning games for the system, Sony saw the writing on the wall that the industry had moved towards favoring ease of porting across platforms over individual power brought by architecture that was highly bespoke but hard to work with, and from the [=PS4=] onward abandoned weird, proprietary chipsets in favor of off-the-shelf AMD processors that were far easier to program for.
** A common criticism of the UsefulNotes/PlaystationVita is that managing your games and saves is a tremendous hassle: for some reason, deleting a Vita game will also delete its save files, meaning that if you want to make room for a new game you'll have to kiss your progress goodbye. This can be circumvented by transferring the files to a PC or uploading them to the cloud, but the latter requires a [=PlayStation=] Plus subscription to use. One wonders why they don't allow you to simply keep the save file like the [=PS1=] and PSP games do. This is made all the more annoying by the Vita's notoriously small and overpriced proprietary memory cards (a format possibly based on Sony's failed Memory Stick Micro M2, which has a very similar form factor but is not compatible with the Vita, and which is not to be confused with M.2 solid state drives), which means that if you buy a lot of games in digital format, you probably won't be able to hold your whole collection at the same time, even if you shell out big money for a [=32GB=] (the biggest widely available capacity, about $60) or [=64GB=] ([[NoExportForYou must be imported from Japan]], can cost over $100, and is reported to sometimes suffer issues such as slow loading, game crashes, and data loss) card.

to:

** Reliability issues aside, the [=PS3=]'s actual hardware wasn't exactly a winner either, and depending on who you ask, the [=PS3=] was either a well-designed if misunderstood piece of tech or had the worst internal console architecture since the UsefulNotes/SegaSaturn.Platform/SegaSaturn. Ken Kutaragi envisioned the [=PlayStation=] 3 as "a supercomputer for the home", and as a result Sony implemented their Cell Broadband Engine processor, co-developed by IBM and Toshiba for supercomputer applications, into the console. While this in theory made the console much more powerful than the Xbox 360, in practice this made the system exponentially more difficult to program for as the CPU was not designed with video games in mind. In layman's terms, it featured eight individually programmable "cores", one general-purpose, with the others much more specialized and with only limited access to the rest of the system. Contrast that with, say, the Xbox 360's Xenon processor, which used a much more conventional architecture with three general-purpose cores that was much easier to program, and that was exactly the [=PS3=]'s downfall from a hardware standpoint. The Cell processor's unconventional architecture meant that it was notoriously difficult to write efficient code for (for comparison, a program that only consists of a few lines of code could easily consist of hundreds of lines if converted to Cell code), and good luck rewriting code designed for conventional processors into code for the Cell processor, which explains why many multi-platform games ran better on the 360 than on the [=PS3=]: many developers weren't so keen on spending development resources rewriting their game to run properly on the [=PS3=], so what they would do instead was run the game on the general-purpose core and ignore the rest, effectively using only a fraction of the system's power (see the sketch after this list). While developers would later put out some visually stunning games for the system, Sony saw the writing on the wall that the industry had moved towards favoring ease of porting across platforms over individual power brought by architecture that was highly bespoke but hard to work with, and from the [=PS4=] onward abandoned weird, proprietary chipsets in favor of off-the-shelf AMD processors that were far easier to program for.
** A common criticism of the UsefulNotes/PlaystationVita Platform/PlaystationVita is that managing your games and saves is a tremendous hassle: for some reason, deleting a Vita game will also delete its save files, meaning that if you want to make room for a new game you'll have to kiss your progress goodbye. This can be circumvented by transferring the files to a PC or uploading them to the cloud, but the latter requires a [=PlayStation=] Plus subscription to use. One wonders why they don't allow you to simply keep the save file like the [=PS1=] and PSP games do. This is made all the more annoying by the Vita's notoriously small and overpriced proprietary memory cards (a format possibly based on Sony's failed Memory Stick Micro M2, which has a very similar form factor but is not compatible with the Vita, and which is not to be confused with M.2 solid state drives), which means that if you buy a lot of games in digital format, you probably won't be able to hold your whole collection at the same time, even if you shell out big money for a [=32GB=] (the biggest widely available capacity, about $60) or [=64GB=] ([[NoExportForYou must be imported from Japan]], can cost over $100, and is reported to sometimes suffer issues such as slow loading, game crashes, and data loss) card.
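As a rough illustration of the trade-off described in the [=PS3=] entry above, the sketch below contrasts the lazy route (run everything on the one general-purpose core) with a version that actually partitions the work. This is a hedged stand-in, with ordinary POSIX threads in place of the Cell's specialized cores; real SPE code would additionally need its own toolchain and explicit DMA transfers into each core's small local store, which is exactly where the "hundreds of lines" of extra code came from.

```c
/* Hedged sketch, not real Cell/SPE code: plain POSIX threads stand in for
   the specialized cores. Compile with -lpthread. */
#include <pthread.h>
#include <stdio.h>

#define N       (1 << 20)
#define WORKERS 4                 /* stand-in for a handful of specialized cores */

static float data[N];

/* The "easy" route many ports took: do everything on the general-purpose core. */
static double sum_on_main_core(void)
{
    double s = 0.0;
    for (int i = 0; i < N; ++i) s += data[i];
    return s;
}

/* The "hard" route: carve the array into chunks and farm them out. */
typedef struct { int begin, end; double partial; } Chunk;

static void *worker(void *arg)
{
    Chunk *c = (Chunk *)arg;
    c->partial = 0.0;
    for (int i = c->begin; i < c->end; ++i) c->partial += data[i];
    return NULL;
}

static double sum_with_workers(void)
{
    pthread_t tid[WORKERS];
    Chunk chunk[WORKERS];
    double s = 0.0;
    for (int w = 0; w < WORKERS; ++w) {
        chunk[w].begin = w * (N / WORKERS);
        chunk[w].end   = (w + 1) * (N / WORKERS);
        pthread_create(&tid[w], NULL, worker, &chunk[w]);
    }
    for (int w = 0; w < WORKERS; ++w) {
        pthread_join(tid[w], NULL);
        s += chunk[w].partial;
    }
    return s;
}

int main(void)
{
    for (int i = 0; i < N; ++i) data[i] = 1.0f;
    printf("single core: %.0f\n", sum_on_main_core());
    printf("partitioned: %.0f\n", sum_with_workers());
    return 0;
}
```

On the real hardware, the partitioned path is where all the performance lived, but every chunk boundary, data transfer, and synchronization point had to be managed by hand, which is why so many multi-platform ports never bothered.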



** While the UsefulNotes/PlayStation4 is mostly a well-built console, it has an AchillesHeel in that the heat exhaust vents on the back are too large. The heat produced by the system invites insects to crawl inside the console, which can then short-circuit the console if they step on the wrong things. If you live in an area where insects are hard to avoid or get rid of, owning a [=PS4=] becomes a major crapshoot.

to:

** While the UsefulNotes/PlayStation4 Platform/PlayStation4 is mostly a well-built console, it has an AchillesHeel in that the heat exhaust vents on the back are too large. The heat produced by the system invites insects to crawl inside the console, which can then short-circuit the console if they step on the wrong things. If you live in an area where insects are hard to avoid or get rid of, owning a [=PS4=] becomes a major crapshoot.



* After mocking the [[UsefulNotes/GameBoyAdvance GBA]] as childish in its PR, Nokia created the complete joke of a design that was the original N-Gage. As a phone, the only way you could speak or hear anything effectively was to hold the thin side of the unit to your ear (earning it the derisive nickname "taco phone" and the infamous "sidetalking"). From a gaming point of view it was even worse, as the screen was oriented vertically instead of horizontally like most handhelds, limiting the player's ability to see the game field (very problematic with games like the N-Gage port of ''[[VideoGame/SonicAdvanceTrilogy Sonic Advance]]''). Worst of all, however, was the fact that in order to change games one had to remove the casing and the battery every single time.

to:

* After mocking the [[UsefulNotes/GameBoyAdvance [[Platform/GameBoyAdvance GBA]] as childish in its PR, Nokia created the complete joke of a design that was the original N-Gage. As a phone, the only way you could speak or hear anything effectively was to hold the thin side of the unit to your ear (earning it the derisive nickname "taco phone" and the infamous "sidetalking"). From a gaming point of view it was even worse, as the screen was oriented vertically instead of horizontally like most handhelds, limiting the player's ability to see the game field (very problematic with games like the N-Gage port of ''[[VideoGame/SonicAdvanceTrilogy Sonic Advance]]''). Worst of all, however, was the fact that in order to change games one had to remove the casing and the battery every single time.



* Much like the Atari 5200, the UsefulNotes/{{Intellivision}} wasn't too badly designed a console in general - the only major issue was that its non-standard CPU design meant development wasn't quite as straightforward as on its contemporaries - but the controllers were a big problem. Instead of a conventional joystick, they used a flat disc that you had to press down on, which rapidly became tiring and didn't allow for very precise control. The action buttons on the side were also rather small and squat, making them difficult to push when you really needed to. However, by far the biggest issue was that Mattel for some reason decided that the controllers should be hard-wired to the console, making it impossible to swap them out for third-party alternatives or buy extension cables for players who preferred to sit further away from their TV set. The controller problems have been cited as one of the major reasons why the Intellivision "only" managed to carve a niche out as the main alternative to the UsefulNotes/Atari2600 in its early years, while the UsefulNotes/{{Colecovision}} - which was more powerful, easier to develop for, and whose controller, despite being just as bad as the Intellivision's if not worse, could actually be swapped for third-party alternatives - started thoroughly clobbering the 2600 in sales when it arrived on the scene a few years later.

to:

* Much like the Atari 5200, the UsefulNotes/{{Intellivision}} Platform/{{Intellivision}} wasn't too badly designed a console in general - the only major issue was that its non-standard CPU design meant development wasn't quite as straightforward as on its contemporaries - but the controllers were a big problem. Instead of a conventional joystick, they used a flat disc that you had to press down on, which rapidly became tiring and didn't allow for very precise control. The action buttons on the side were also rather small and squat, making them difficult to push when you really needed to. However, by far the biggest issue was that Mattel for some reason decided that the controllers should be hard-wired to the console, making it impossible to swap them out for third-party alternatives or buy extension cables for players who preferred to sit further away from their TV set. The controller problems have been cited as one of the major reasons why the Intellivision "only" managed to carve a niche out as the main alternative to the UsefulNotes/Atari2600 Platform/Atari2600 in its early years, while the UsefulNotes/{{Colecovision}} Platform/{{Colecovision}} - which was more powerful, easier to develop for, and whose controller, despite being just as bad as the Intellivision's if not worse, could actually be swapped for third-party alternatives - started thoroughly clobbering the 2600 in sales when it arrived on the scene a few years later.



* The UsefulNotes/PhilipsCDi has a rather egregious design flaw: On some models (it's not exactly known which), when the internal battery dies, in addition to losing your save files and the internal clock's functions, there is a chance that a file stored in RAM required to boot the system can either be ''lost or corrupted'', turning it into a regular CD player at best and effectively bricking the system at worst (read [[https://cdii.blogspot.com/2020/04/frequently-asked-questions-about-cd-is.html here]] for more information). Replacing the battery is no easy feat either: it's buried inside a "Timekeeper" chip, and the plastic casing must either be drilled away or the chip must be replaced entirely. Contrast this with the Sega Saturn, where the battery is easily accessible in the system's Expansion Port (and when it dies, the worst that happens is that your save data, which can be backed up to an external memory cartridge, is deleted), or the Sony [=PlayStation=], which doesn't have an internal battery at all.

to:

* The UsefulNotes/PhilipsCDi Platform/PhilipsCDi has a rather egregious design flaw: On some models (it's not exactly known which), when the internal battery dies, in addition to losing your save files and the internal clock's functions, there is a chance that a file stored in RAM required to boot the system can either be ''lost or corrupted'', turning it into a regular CD player at best and effectively bricking the system at worst (read [[https://cdii.blogspot.com/2020/04/frequently-asked-questions-about-cd-is.html here]] for more information). Replacing the battery is no easy feat either: it's buried inside a "Timekeeper" chip, and the plastic casing must either be drilled away or the chip must be replaced entirely. Contrast this with the Sega Saturn, where the battery is easily accessible in the system's Expansion Port (and when it dies, the worst that happens is that your save data, which can be backed up to an external memory cartridge, is deleted), or the Sony [=PlayStation=], which doesn't have an internal battery at all.



** Later in the system's life, it became abundantly clear that Philips did ''not'' design the CD-i with video games in mind, as it lacked several game-specific hardware features like sprite-scaling and sprite-rotation, things that the Super Nintendo and Sega Genesis had to an extent (and were fully capable of with an additional graphics co-processor, and the latter system could push very primitive, untextured polygons without any help). The CD-i was more or less designed for basic interactive software, such as point-and-click edutainment games, so when Philips tried to shift the focus of the system to full-fledged video games such as the ill-fated ''VideoGame/HotelMario'' and the ''[[VideoGame/TheLegendOfZeldaCDiGames Zelda]]'' CD-i trilogy, these games already looked dated and primitive compared to games such as ''VideoGame/DonkeyKongCountry'' and ''VideoGame/StarFox''. Any possible opportunities to rectify this via hardware revisions were stonewalled by Philips' CD-i Green Book standard (which was both a data specification ''and'' a hardware specification), as Philips wouldn't budge on system specs, prioritizing wide compatibility over keeping the hardware up to date. This doomed the CD-i as a video game platform, especially as it was being supplanted by more powerful machines such as the [[UsefulNotes/ThreeDOInteractiveMultiplayer 3DO]] (which could do everything the CD-i could do, only better), the UsefulNotes/SegaSaturn, the UsefulNotes/Nintendo64, and most damningly, the Creator/{{Sony}} UsefulNotes/PlayStation. Video games aside, this also ensured that the CD-i could not keep up with the [[TechnologyMarchesOn rapidly evolving technology of the era]], leaving it in the dust of more capable media appliances such as the DVD player.
* The Nuon was a hardware standard for DVD players that enabled [=3D=] video games to be played on the system in addition to offering enhanced DVD playback features. While it was intended as a competitor to contemporary sixth-generation video game consoles such as the UsefulNotes/PlayStation2, UsefulNotes/GameCube, and UsefulNotes/{{Xbox}}, its hardware was woefully underpowered for the era, barely outperforming the Nintendo 64. This, and the fact that Nuon circuitry was offered in so few models, ensured that the format was dead on arrival.

to:

** Later in the system's life, it became abundantly clear that Philips did ''not'' design the CD-i with video games in mind, as it lacked several game-specific hardware features like sprite-scaling and sprite-rotation, things that the Super Nintendo and Sega Genesis had to an extent (and were fully capable of with an additional graphics co-processor, and the latter system could push very primitive, untextured polygons without any help). The CD-i was more or less designed for basic interactive software, such as point-and-click edutainment games, so when Philips tried to shift the focus of the system to full-fledged video games such as the ill-fated ''VideoGame/HotelMario'' and the ''[[VideoGame/TheLegendOfZeldaCDiGames Zelda]]'' CD-i trilogy, these games already looked dated and primitive compared to games such as ''VideoGame/DonkeyKongCountry'' and ''VideoGame/StarFox''. Any possible opportunities to rectify this via hardware revisions were stonewalled by Philips' CD-i Green Book standard (which was both a data specification ''and'' a hardware specification), as Philips wouldn't budge on system specs, prioritizing wide compatibility over keeping the hardware up to date. This doomed the CD-i as a video game platform, especially as it was being supplanted by more powerful machines such as the [[UsefulNotes/ThreeDOInteractiveMultiplayer [[Platform/ThreeDOInteractiveMultiplayer 3DO]] (which could do everything the CD-i could do, only better), the UsefulNotes/SegaSaturn, Platform/SegaSaturn, the UsefulNotes/Nintendo64, Platform/Nintendo64, and most damningly, the Creator/{{Sony}} UsefulNotes/PlayStation.Platform/PlayStation. Video games aside, this also ensured that the CD-i could not keep up with the [[TechnologyMarchesOn rapidly evolving technology of the era]], leaving it in the dust of more capable media appliances such as the DVD player.
* The Nuon was a hardware standard for DVD players that enabled [=3D=] video games to be played on the system in addition to offering enhanced DVD playback features. While it was intended as a competitor to contemporary sixth-generation video game consoles such as the UsefulNotes/PlayStation2, UsefulNotes/GameCube, Platform/PlayStation2, Platform/GameCube, and UsefulNotes/{{Xbox}}, Platform/{{Xbox}}, its hardware was woefully underpowered for the era, barely outperforming the Nintendo 64. This, and the fact that Nuon circuitry was offered in so few models, ensured that the format was dead on arrival.
Is there an issue? Send a MessageReason:
None


** While the UsefulNotes/Atari5200 wasn't that poorly-designed of a system in general -- at worst, its absurdly huge size and power/RF combo switchbox could be annoying to deal with, but Atari eventually did away with the latter and were working on a smaller revision when [[UsefulNotes/TheGreatVideoGameCrashOf1983 the market crashed]], forcing them to discontinue the system -- its controllers were a different matter entirely. In many ways they were ahead of their time, with analogue movement along with start and pause buttons. Unfortunately, Atari cheaped out, not bothering to provide them with an auto-centring mechanism and building them out of such flimsy materials that they usually failed after a few months, if not ''weeks''. The poor-quality controllers subsequently played a major part in dooming the system.

to:

** While the UsefulNotes/Atari5200 Platform/Atari5200 wasn't that poorly-designed of a system in general -- at worst, its absurdly huge size and power/RF combo switchbox could be annoying to deal with, but Atari eventually did away with the latter and were working on a smaller revision when [[UsefulNotes/TheGreatVideoGameCrashOf1983 the market crashed]], forcing them to discontinue the system -- its controllers were a different matter entirely. In many ways they were ahead of their time, with analogue movement along with start and pause buttons. Unfortunately, Atari cheaped out, not bothering to provide them with an auto-centring mechanism and building them out of such flimsy materials that they usually failed after a few months, if not ''weeks''. The poor-quality controllers subsequently played a major part in dooming the system.
Is there an issue? Send a MessageReason:
None


* The UsefulNotes/SuperNintendoEntertainmentSystem was an overall improvement over the NES, doing away with the front-loading cartridge slot altogether and using better-quality pin connectors. That being said, there's still one notable flaw: the power plug on the back of the system is prone to breaking. Whereas the inner barrel of the power plug on the NES is made of metal, the SNES uses a plastic barrel instead. This results in a more fragile power port in which the inner barrel breaks off from the stress of repeatedly plugging and unplugging the power cord from the system end.

to:

* The UsefulNotes/SuperNintendoEntertainmentSystem was an overall improvement over the NES, doing away with the front-loading cartridge slot altogether and using better-quality pin connectors. That being said, there's still one notable flaw: the power plug on the back of the system is prone to breaking. Whereas the inner barrel of the power plug on the NES is made of metal, the SNES uses a plastic barrel instead. This results in a more fragile power port in which the inner barrel breaks off from the stress of repeatedly plugging and unplugging the power cord from the system end. It's not uncommon to see used consoles with broken AC In plugs. Thankfully, you can buy replacements, but you will need to know how to solder to replace the piece.
Is there an issue? Send a MessageReason:
None

Added DiffLines:

[[quoteright:335:https://static.tvtropes.org/pmwiki/pub/images/1eab8c5a07a21d085abb7311a2.jpg]]
[[caption-width-right:350:[[SelfDemonstratingArticle There's nothing about this that's good.]]]]
%% 15 WIDTH DIFFERENCE IS INTENTIONAL DESIGN
Is there an issue? Send a MessageReason:
Just For Pun has been disambiguated.


* ''VideoGame/PowerGigRiseOfTheSixString'' was an attempt to [[FollowTheLeader copy the success]] of ''VideoGame/GuitarHero'' and ''VideoGame/RockBand''. However, the game's poorly-designed peripherals, among other issues, caused it to fail and be mostly forgotten. Of note is its drum kit, which attempted to solve one of the main problems with its competitors' drums: playing drums in these games is very noisy, which makes the game impractical when living with other people or in an apartment. ''Power Gig''[='s=] "[=AirStrike=]" drum kit gets around this by not having you hit anything: instead of hitting physical drum pads, you swing specially-made drumsticks above motion sensors, allowing you to drum silently. The downside is that since you're not hitting anything, it's hard to tell where you're supposed to swing, and whether that note you missed was because you didn't hit the right pad or the drum just failed to detect your movement. The lack of feedback made using these drums more frustrating than fun. [[https://www.engadget.com/2010/06/08/hands-on-power-gigs-airstrike-drum/ Engadget's hands-on preview]] had few positives to say about it, which is particularly notable considering how such articles are meant to [[JustForPun drum up]] hype for the game.

to:

* ''VideoGame/PowerGigRiseOfTheSixString'' was an attempt to [[FollowTheLeader copy the success]] of ''VideoGame/GuitarHero'' and ''VideoGame/RockBand''. However, the game's poorly-designed peripherals, among other issues, caused it to fail and be mostly forgotten. Of note is its drum kit, which attempted to solve one of the main problems with its competitors' drums: playing drums in these games is very noisy, which makes the game impractical when living with other people or in an apartment. ''Power Gig''[='s=] "[=AirStrike=]" drum kit gets around this by not having you hit anything: instead of hitting physical drum pads, you swing specially-made drumsticks above motion sensors, allowing you to drum silently. The downside is that since you're not hitting anything, it's hard to tell where you're supposed to swing, and whether that note you missed was because you didn't hit the right pad or the drum just failed to detect your movement. The lack of feedback made using these drums more frustrating than fun. [[https://www.engadget.com/2010/06/08/hands-on-power-gigs-airstrike-drum/ Engadget's hands-on preview]] had few positives to say about it, which is particularly notable considering how such articles are meant to [[JustForPun [[{{Pun}} drum up]] hype for the game.

Added: 1874

Changed: 1683

Is there an issue? Send a MessageReason:
None


*** Things were turned up to eleven by the Atari Jaguar CD add-on. Aside from the crappy overall production quality of the add-on (the Jaguar itself wasn't too hot in this department, either) and poor aesthetics which [[WebVideo/TheAngryVideoGameNerd many]] [[WebVideo/TheSpoonyExperiment people]] have likened to a toilet seat, the CD sat on top of the Jaguar and often failed to connect properly to the cartridge slot, as opposed to the similar add-ons for the NES and Genesis which used the console's own weight to secure a good connection. Moreover, the disc lid was badly designed and tended to squash the CD against the bottom of the console, which in turn would cause the already low-quality disc motor to break apart internally from its fruitless attempts to spin the disc. Even inserting the Memory Track cartridge wrong can cause the main Jaguar unit to return a connection error: cartridges in the main console take boot priority over [=CDs=] in the add-on, so an improperly seated Memory Track cartridge is simply read as an improperly seated game cartridge and causes a failure to boot. All of this was compounded by Atari's decision to ditch any form of error protection code so as to increase the disc capacity to 800 megabytes, which caused software errors aplenty, and the fact that the parts themselves tended to be defective. And remember, this was an add-on for a console that only sold 125,000 units, making its very existence an example of idiot design. Only 25,000 were produced and nowhere near that amount were sold or even shipped (and only 13 games were ever released). Due to scarcity, Jaguar [=CDs=] go for ungodly amounts of money on auction sites, and due to the generally poor design, you're more likely to end up TrappedInAnotherWorld than you are to get a working unit.
*** Of note, it was not rare for the device to come fresh from the box in such a state of disrepair that ''highly trained specialists'' couldn't get it working - for example, it could be ''soldered directly to the cartridge port'' and '''still''' [[EpicFail display a connection error.]] This, by the way, is exactly what happened when Creator/JamesRolfe tried to review the system.

to:

*** Things were turned up to eleven by the Atari Jaguar CD add-on. Aside from the crappy overall production quality of the add-on (the Jaguar itself wasn't too hot in this department, either) and poor aesthetics which [[WebVideo/TheAngryVideoGameNerd many]] [[WebVideo/TheSpoonyExperiment people]] have likened to a toilet seat, the CD sat on top of the Jaguar and often failed to connect properly to the cartridge slot, as opposed to the similar add-ons for the NES and Genesis which used the console's own weight to secure a good connection. \\
\\
Moreover, the disc lid was badly designed and tended to squash the CD against the bottom of the console, which in turn would cause the already low-quality disc motor to break apart internally from its fruitless attempts to spin the disc. Even inserting the Memory Track cartridge wrong can cause the main Jaguar unit to return a connection error: cartridges in the main console take boot priority over [=CDs=] in the add-on, so an improperly seated Memory Track cartridge is simply read as an improperly seated game cartridge and causes a failure to boot. \\
\\
All of this was compounded by Atari's decision to ditch any form of error protection code so as to increase the disc capacity to 800 megabytes, which caused software errors aplenty, and the fact that the parts themselves tended to be defective. And remember, this was an add-on for a console that only sold 125,000 units, making its very existence an example of idiot design. Only 25,000 were produced and nowhere near that amount were sold or even shipped (and only 13 games were ever released). Due to scarcity, Jaguar [=CDs=] go for ungodly amounts of money on auction sites, and due to the generally poor design, you're more likely to end up TrappedInAnotherWorld than you are to get a working unit.
unit.
*** Of note, it was not rare for the device to come fresh from the box in such a state of disrepair that ''highly trained specialists'' couldn't get it working - for example, it could be ''soldered directly to the cartridge port'' and '''still''' [[EpicFail display a connection error.]] This, by the way, is exactly what happened when Creator/JamesRolfe tried to review the system.system, and while Noah Antwiler of The Spoony Experiment was able to get his defective unit working, it immediately died for good as soon as he finished recording footage for the review he used it in.
Is there an issue? Send a MessageReason:
Removed one letter too many.


** Infamously, the controllers were directly plugged onto the motherboard, requiring the console to be opened up in order to replace them. (Though no sodering is required, mercifully.) This would be bad enough, but the controller wires were also pathetically short (18 inches - the NES controller wires were three times that, for comparison) and plugged into the ''back'' of the console, meaning you pretty much had to play with the Famicom right next to you.

to:

** Infamously, the controllers were directly plugged onto the motherboard, requiring the console to be opened up in order to replace them. (Though no sodering soldering is required, mercifully.) This would be bad enough, but the controller wires were also pathetically short (18 inches - the NES controller wires were three times that, for comparison) and plugged into the ''back'' of the console, meaning you pretty much had to play with the Famicom right next to you.
Is there an issue? Send a MessageReason:
None


** Infamously, the controllers were directly plugged onto the motherboard, requiring the console to be opened up in order to replace them. (Though no soldiering is required, mercifully.) This would be bad enough, but the controller wires were also pathetically short (18 inches - the NES controller wires were three times that, for comparison) and plugged into the ''back'' of the console, meaning you pretty much had to play with the Famicom right next to you.

to:

** Infamously, the controllers were directly plugged onto the motherboard, requiring the console to be opened up in order to replace them. (Though no soldiering sodering is required, mercifully.) This would be bad enough, but the controller wires were also pathetically short (18 inches - the NES controller wires were three times that, for comparison) and plugged into the ''back'' of the console, meaning you pretty much had to play with the Famicom right next to you.

Added: 813

Changed: 2

Is there an issue? Send a MessageReason:
None


Additional, while the front-loader design of the original NES is well recognized and iconic, the VCR-like design isn't very helpful for seating cartridges firmly into the reader without causing wear and tear on the connector pins, which get bent out of shape over time. This was not helped by the choice of brass-plated nickel connectors that are prone to oxidation and therefore require cleaning.

to:

Additional, Additionally, while the front-loader design of the original NES is well recognized and iconic, the VCR-like design isn't very helpful for seating cartridges firmly into the reader without causing wear and tear on the connector pins, which get bent out of shape over time. This was not helped by the choice of brass-plated nickel connectors that are prone to oxidation and therefore require cleaning.
* That said, the Famicom and its peripherals weren't exactly free from their own design flaws either:
** Infamously, the controllers were directly plugged onto the motherboard, requiring the console to be opened up in order to replace them. (Though no soldiering is required, mercifully.) This would be bad enough, but the controller wires were also pathetically short (18 inches - the NES controller wires were three times that, for comparison) and plugged into the ''back'' of the console, meaning you pretty much had to play with the Famicom right next to you.
** The majority of the Disk System's "Disk Cards" (effectively a proprietary form of floppy disk) had no protective shutter on the access window, meaning careless or clumsy users could accidentally touch the disk surface and wipe out sections of data.
Is there an issue? Send a MessageReason:
Turns out it was actually a defective product. Oops.


* Most of the DX toyline created by Bandai Namco is often well received for having versatile features on its toys while mimicking the sounds and actions from their respective TV series. However, one of the post-series DX toys for ''Series/KamenRiderGeats'', the Shinobi Raise Buckle, for some reason didn't use a pin meant for toys with a significant number of sounds coming from the Raise Buckle itself, and instead used the ''[[DidntThinkThisThrough same pin as the Ninja Raise Buckle]]''. [[https://youtu.be/ZlLohr45XH8?si=-iU8-jmgEfzhiuoW&t=140 The results are just as puzzling as you expect]].
Is there an issue? Send a MessageReason:
None

Added DiffLines:

* Most of the DX toyline created by Bandai Namco is often well received for having versatile features on its toys while mimicking the sounds and actions from their respective TV series. However, one of the post-series DX toys for ''Series/KamenRiderGeats'', the Shinobi Raise Buckle, for some reason didn't use a pin meant for toys with a significant number of sounds coming from the Raise Buckle itself, and instead used the ''[[DidntThinkThisThrough same pin as the Ninja Raise Buckle]]''. [[https://youtu.be/ZlLohr45XH8?si=-iU8-jmgEfzhiuoW&t=140 The results are just as puzzling as you expect]].
Is there an issue? Send a MessageReason:
None

Added DiffLines:

* The Ayaneo 2s is one of the many Steam Deck clones that have been made recently; however, it has one glaring problem. When the device heats up (which it often does when gaming), not only does it become almost too hot to touch, but the excessive heat can also cause light bleed issues with the LCD display, most likely due to the heat softening the glue that holds the diffusion layer in place. A more in-depth review, as well as a more thorough demonstration of the issue, can be found [[https://youtu.be/2opqHtnn4Nk?si=7UBLZXNEPIlUKkcy&t=1512 here]].
** The Asus ROG Ally also has an issue related to excess heat. This time, when the device is under load, the SD card reader can slow transfer speeds to a crawl, if the card is recognized at all. [[https://shorturl.at/gyABI Asus acknowledged the issue]] and quickly released a software update that tweaks fan speeds in hopes of alleviating it.
Is there an issue? Send a MessageReason:
None


* The original cabinet for the first ''VideoGame/StreetFighterI'' had two giant pressure sensitive buttons for punches and kicks, with the strength of your attacks increasing the harder you slammed down on them. Putting aside the stiffness of the combat itself, the fact that the game all but encouraged hitting buttons as hard as possible meant that more often than not, players ended up damaging the buttons and the machine, making the cabinet an absolute maintenance nightmare. But even if you avoided literally breaking the game, repeatedly having to slam your fist down on big rubber buttons was a good way to make it very sore very quickly, and occasionally, players would end up missing them and damaging their hands on the cabinet. Needless to say, the two-button cabinet flopped, and Creator/{{Capcom}} quickly commissioned a new and cheaper cabinet design that replaced the two big rubber buttons with six smaller plastic buttons, with two rows for light, medium and heavy attacks. Not only did the new cabinet significantly outsell the original and make the game profitable, but the six-button control scheme would become the standard for the rest of the series, especially after it was paired with ''VideoGame/StreetFighterII''’s much more refined combat.

to:

* The original cabinet for the first ''VideoGame/StreetFighterI'' had [[PressureSensitiveInterface two giant pressure sensitive pressure-sensitive buttons for punches and kicks, with the strength of your attacks increasing the harder you slammed down on them.them]]. Putting aside the stiffness of the combat itself, the fact that the game all but encouraged hitting buttons as hard as possible meant that more often than not, players ended up damaging the buttons and the machine, making the cabinet an absolute maintenance nightmare. But even if you avoided literally breaking the game, repeatedly having to slam your fist down on big rubber buttons was a good way to make it very sore very quickly, and occasionally, players would end up missing them and damaging their hands on the cabinet. Needless to say, the two-button cabinet flopped, and Creator/{{Capcom}} quickly commissioned a new and cheaper cabinet design that replaced the two big rubber buttons with six smaller plastic buttons, with two rows for light, medium and heavy attacks. Not only did the new cabinet significantly outsell the original and make the game profitable, but the six-button control scheme would become the standard for the rest of the series, especially after it was paired with ''VideoGame/StreetFighterII''’s much more refined combat.

Added: 413

Changed: 3670

Is there an issue? Send a MessageReason:
None


** While the UsefulNotes/WiiU ultimately proved a failure for several reasons, poor component choices helped contribute to its near-total lack of third-party support. It'd only be a slight exaggeration to say that the system's CPU was just three heavily overclocked Wii [=CPUs=] -- the Wii's own CPU in turn being just a higher-clocked version of the CPU that had been used a decade earlier in the UsefulNotes/NintendoGameCube -- slapped together on the same die, with performance that was abysmally poor by 2012 standards. Its GPU, while not ''as'' slow, wasn't all that much faster than those of the [=PS3=] and Xbox 360,[[note]]in fact, on paper the Wii U's GPU was actually ''slower'' than the [=GPUs=] of either of those consoles, but due to advances in technology was about twice as fast in practice... unless you offloaded code to the GPU to offset the lack of CPU grunt, which would bring it back down to, if not below, [=PS3/360=] levels[[/note]] and used a shader model in-between those of the older consoles and their successors, meaning that ported [=PS3=]/360 games didn't take advantage of the newer hardware, while games designed for the [=PS4=] and Xbox One wouldn't even work to begin with due to the lack of necessary feature support. While Nintendo likely stuck with the [=PowerPC=] architecture for UsefulNotes/BackwardsCompatibility reasons, the system would likely have fared much better if Nintendo had just grabbed an off-the-shelf AMD laptop APU - which had enough power even in 2012 to brute-force emulate the Wii, eliminating the main reason to keep with the [=PowerPC=] line - stuffed it into a Wii case and called it a day. Fortunately, Nintendo seems to have learned from this, basing the UsefulNotes/NintendoSwitch on an existing [=nVidia=] mobile chip which thus far has proven surprisingly capable of punching above its weight.
* Speaking of the Nintendo Switch...

to:

* UsefulNotes/WiiU:
** While the UsefulNotes/WiiU console ultimately proved a failure for several reasons, poor component choices helped contribute to its near-total lack of third-party support. It'd only be a slight exaggeration to say that the system's CPU was just three heavily overclocked Wii [=CPUs=] -- the Wii's own CPU in turn being just a higher-clocked version of the CPU that had been used a decade earlier in the UsefulNotes/NintendoGameCube -- slapped together on the same die, with performance that was abysmally poor by 2012 standards. Its GPU, while not ''as'' slow, wasn't all that much faster than those of the [=PS3=] and Xbox 360,[[note]]in fact, on paper the Wii U's GPU was actually ''slower'' than the [=GPUs=] of either of those consoles, but due to advances in technology was about twice as fast in practice... unless you offloaded code to the GPU to offset the lack of CPU grunt, which would bring it back down to, if not below, [=PS3/360=] levels[[/note]] and used a shader model in-between those of the older consoles and their successors, meaning that ported [=PS3=]/360 games didn't take advantage of the newer hardware, while games designed for the [=PS4=] and Xbox One wouldn't even work to begin with due to the lack of necessary feature support. While Nintendo likely stuck with the [=PowerPC=] architecture for UsefulNotes/BackwardsCompatibility reasons, the system would likely have fared much better if Nintendo had just grabbed an off-the-shelf AMD laptop APU - which had enough power even in 2012 to brute-force emulate the Wii, eliminating the main reason to keep with the [=PowerPC=] line - stuffed it into a Wii case and called it a day. Fortunately, Nintendo seems to have learned from this, basing the UsefulNotes/NintendoSwitch on an existing [=nVidia=] mobile chip which thus far has proven surprisingly capable of punching above its weight.
* Speaking ** The console came with a paltry amount of on-board storage: 32GB on the Nintendo Switch...Deluxe models, and a mere 8GB for the Basic ones. While this could be expanded with an external hard-drive, the USB ports on the console don't put out enough power by themselves to power a hard-drive, requiring either one that can be powered externally, or a Y-splitter cable that has two USB plugs at one end.
* UsefulNotes/NintendoSwitch:
Is there an issue? Send a MessageReason:
None


* The original cabinet for the first ''VideoGame/StreetFighterI'' had two giant pressure sensitive buttons for punches and kicks, with the strength of your attacks increasing the harder you slammed down on them. Putting aside the stiffness of the combat itself, the fact that the game all but encouraged hitting buttons as hard as possible meant that more often than not, players ended up damaging the buttons and the machine, making the cabinet an absolute maintenance nightmare. But even if you avoided literally breaking the game, repeatedly having to slam your fist down on big rubber buttons was a good way to make it very sore very quickly, and occasionally, players would end up missing them and damaging their hands on the cabinet. Needless to say, the two-button cabinet flopped, and Creator/{{Capcom}} quickly commissioned a new cabinet design that replaced the two big rubber buttons with six smaller plastic buttons, with two rows for light, medium and heavy attacks. Not only did the new cabinet significantly outsell the original and make the game profitable, but the six-button control scheme would become the standard for the rest of the series, especially after it was paired with ''VideoGame/StreetFighterII''’s much more refined combat.

to:

* The original cabinet for the first ''VideoGame/StreetFighterI'' had two giant pressure sensitive buttons for punches and kicks, with the strength of your attacks increasing the harder you slammed down on them. Putting aside the stiffness of the combat itself, the fact that the game all but encouraged hitting buttons as hard as possible meant that more often than not, players ended up damaging the buttons and the machine, making the cabinet an absolute maintenance nightmare. But even if you avoided literally breaking the game, repeatedly having to slam your fist down on big rubber buttons was a good way to make it very sore very quickly, and occasionally, players would end up missing them and damaging their hands on the cabinet. Needless to say, the two-button cabinet flopped, and Creator/{{Capcom}} quickly commissioned a new and cheaper cabinet design that replaced the two big rubber buttons with six smaller plastic buttons, with two rows for light, medium and heavy attacks. Not only did the new cabinet significantly outsell the original and make the game profitable, but the six-button control scheme would become the standard for the rest of the series, especially after it was paired with ''VideoGame/StreetFighterII''’s much more refined combat.
Is there an issue? Send a MessageReason:
None


* The original cabinet for the first ''VideoGame/StreetFighterI'' had two giant pressure sensitive buttons for punches and kicks, with the strength of your attacks increasing the harder you slammed down on them. Putting aside the stiffness of the combat itself, the fact that the game encouraged hitting buttons as hard as possible meant that more of then than not, players ended up damaging the buttons and the machine, making the cabinet an absolute maintenance nightmare. But even if you avoided literally breaking the game, repeatedly having to slam your fist down on big rubber buttons was a good way to make it very sore very quickly, and occasionally, players would end up missing them and damaging their hands on the cabinet. Needless to say, the two-button cabinet flopped, and Creator/{{Capcom}} quickly commissioned a new cabinet design that replaced the two big rubber buttons with six smaller plastic buttons, with two rows for light, medium and heavy attacks. Not only did the new cabinet significantly outsell the original and make the game profitable, but the six-button control scheme would become the standard for the rest of the series, especially after it was paired with ''VideoGame/StreetFighterII''’s much more refined combat.

to:

* The original cabinet for the first ''VideoGame/StreetFighterI'' had two giant pressure sensitive buttons for punches and kicks, with the strength of your attacks increasing the harder you slammed down on them. Putting aside the stiffness of the combat itself, the fact that the game all but encouraged hitting buttons as hard as possible meant that more of then often than not, players ended up damaging the buttons and the machine, making the cabinet an absolute maintenance nightmare. But even if you avoided literally breaking the game, repeatedly having to slam your fist down on big rubber buttons was a good way to make it very sore very quickly, and occasionally, players would end up missing them and damaging their hands on the cabinet. Needless to say, the two-button cabinet flopped, and Creator/{{Capcom}} quickly commissioned a new cabinet design that replaced the two big rubber buttons with six smaller plastic buttons, with two rows for light, medium and heavy attacks. Not only did the new cabinet significantly outsell the original and make the game profitable, but the six-button control scheme would become the standard for the rest of the series, especially after it was paired with ''VideoGame/StreetFighterII''’s much more refined combat.
Is there an issue? Send a MessageReason:
None


* The original cabinet for the first ''VideoGame/StreetFighterI'' had two giant pressure sensitive buttons for punches and kicks, with the strength of your attacks increasing the harder you slammed down on them. Putting aside the stiffness of the combat itself, repeatedly having to slam your fist down on big rubber buttons was a good way to make it very sore very quickly. And that was just if you were consistently hitting the buttons; more often than not, players would end up missing them and damaging their hands, or the machine, or both, making the cabinet an absolute maintenance nightmare. Needless to say, the two-button cabinet flopped, and Creator/{{Capcom}} quickly commissioned a new cabinet design that replaced the two big rubber buttons with six smaller plastic buttons, with two rows for light, medium and heavy attacks. Not only did the new cabinet significantly outsell the original and make the game profitable, but the six-button control scheme would become the standard for the rest of the series, especially after it was paired with ''VideoGame/StreetFighterII''’s much more refined combat.

to:

* The original cabinet for the first ''VideoGame/StreetFighterI'' had two giant pressure sensitive buttons for punches and kicks, with the strength of your attacks increasing the harder you slammed down on them. Putting aside the stiffness of the combat itself, the fact that the game encouraged hitting buttons as hard as possible meant that more of then than not, players ended up damaging the buttons and the machine, making the cabinet an absolute maintenance nightmare. But even if you avoided literally breaking the game, repeatedly having to slam your fist down on big rubber buttons was a good way to make it very sore very quickly. And that was just if you were consistently hitting the buttons; more often than not, quickly, and occasionally, players would end up missing them and damaging their hands, or hands on the machine, or both, making the cabinet an absolute maintenance nightmare.cabinet. Needless to say, the two-button cabinet flopped, and Creator/{{Capcom}} quickly commissioned a new cabinet design that replaced the two big rubber buttons with six smaller plastic buttons, with two rows for light, medium and heavy attacks. Not only did the new cabinet significantly outsell the original and make the game profitable, but the six-button control scheme would become the standard for the rest of the series, especially after it was paired with ''VideoGame/StreetFighterII''’s much more refined combat.
Is there an issue? Send a MessageReason:
None


* The IBM Deskstar 75GXP, nicknamed the [[FanNickname Death Star]]. While it was a large drive at the time (2000), it had a disturbing habit of suddenly failing, taking your data with it. The magnetic coating was of subpar reliability and came loose easily, causing head crashes that stripped the magnetic layer clean off. One user with a RAID server setup reportedly told their RAID controller manufacturer that they were replacing their IBM Deskstars [[http://s3.computerhistory.org/groups/ds-ibm-75gxp-family-20121031.pdf at a rate of 600-800 drives per day]]. There have been many hard drives that have been criticized for various reasons, but the "Death Star" was something truly spectacular for all the wrong reasons.

to:

* The IBM Deskstar 75GXP, [[FanNickname nicknamed]] the [[Film/ANewHope "Deathstar"]]. While it was a large drive at the time (ranging from 15 to 75 gigabytes in 2000), it had a disturbing habit of suddenly failing, taking your data with it. The magnetic coating was of subpar reliability and came loose easily, causing head crashes that stripped the magnetic layer clean off. One user with a RAID server setup reportedly told their RAID controller manufacturer that they were replacing their IBM Deskstars [[http://s3.computerhistory.org/groups/ds-ibm-75gxp-family-20121031.pdf at a rate of 600-800 drives per day]]. There have been many hard drives that have been criticized for various reasons, but the "Death Star" was something truly spectacular for all the wrong reasons.



After a class-action lawsuit in 1998, Iomega issued a free replacement program , and further rebates in 2001 for future products. It was too little, too late, and CD-R disks were now more popular for mass storage and perceived as more reliable. [[http://www.pcworld.co.nz/article/157888/25_worst_tech_products_all_time/ The New Zealand site for PC World has the original article still available.]] Surprisingly however, Iomega would soldier on making two more generations of ZIP disk drives before leaving the market in the late 2000s.

to:

After a class-action lawsuit in 1998, Iomega issued a free replacement program, and further rebates in 2001 for future products. It was too little, too late, and CD-R disks were now more popular for mass storage and perceived as more reliable. [[http://www.pcworld.co.nz/article/157888/25_worst_tech_products_all_time/ The New Zealand site for PC World has the original article still available.]] Surprisingly, however, Iomega would soldier on making two more generations of ZIP disk drives before leaving the market in the late 2000s.
Is there an issue? Send a MessageReason:
None

Added DiffLines:

* Class D cargo holds were, simply put, a terrible idea from the very beginning. The concept seemingly made sense: the hold was airtight, so any fire inside it would exhaust all the available oxygen and smother itself, meaning no need for pesky fire detection or suppression equipment, which would create extra wiring and be more expensive! What would happen if a fire was caused by an ''[[FailsafeFailure oxidizer]]'' obviously never occurred to anyone. That was, until 1996, when [=ValuJet=]'s Miami maintenance contractor labeled several boxes of expired chemical oxygen generators "Oxy Cannister - EMPTY", resulting in [=ValuJet=] baggage handlers, unaware of what the boxes actually contained, loading them onto a flight to Atlanta and that flight [[https://en.wikipedia.org/wiki/ValuJet_Flight_592 catching fire and crashing into the Everglades]]. The pilots were only alerted by equipment failure and a MassOhCrap in the cabin when flames started to breach the floor, and by that point, the 110 people on that flight were already screwed. In 1998, the FAA, fresh off being chewed out by government officials, NTSB investigators, and victims' family members for not implementing recommendations from previous Class D cargo fires, gave airlines three years to convert all Class D cargo holds to Class C or E.
