
History: DarthWiki/IdiotDesign


Reason: None


* The Samsung Galaxy Note 7, released in August 2016 and discontinued just two months later. On the surface, it was a great cell phone that competed with any number of comparable phablets. The problem? [[http://gizmodo.com/how-a-boring-iphone-7-put-samsung-on-the-path-to-self-d-1786786206 It was rushed to market]] to beat Apple's upcoming iPhone 7 (which came out the following month), and this left it with a serious flaw: namely, that it [[MadeOfExplodium had a habit of spontaneously combusting]]. Samsung hastily recalled the phone in September once it started causing dozens of fires (to the point where aviation safety authorities were telling people not to bring them onto planes), and gave buyers replacements with batteries from a different supplier. When ''those'' phones started catching fire as well, it became obvious that the problems had nothing to do with quality control and ran to the heart of the phone's design [[note]](the phone's power draw was greater than any battery that size could handle, leading its batteries to overvolt, overheat, and undergo the catastrophic chemical reaction that lithium-ion batteries are prone to)[[/note]]. By the time Samsung [[http://www.theverge.com/2016/10/11/13202608/samsung-galaxy-note-7-discontinued discontinued the Galaxy Note 7]], it had already become the [[EveryCarIsAPinto Ford Pinto]] of smartphones and a worldwide joke, with every major wireless carrier in the US having already pulled it from sale. Samsung especially [[BuryYourArt doesn't want to be reminded of it]], to the point that they ordered Website/YouTube to take down any video showing a mod for ''VideoGame/GrandTheftAutoV'' that reskins the Sticky Bombs into the Galaxy Note 7.
* The Google Nexus 6P, made by Huawei, has multiple major design flaws that make accidental damage from sitting on it potentially catastrophic: the back isn't screwed into the structure of the phone even though it's not intended to be removable, leaving the thin aluminum on the sides and a slab of Gorilla Glass 4 not designed for structural integrity to hold the phone together; there's a gap between the battery and motherboard right by the power button that creates a weak point; and the plastic and metal are held together with dovetail joints, which are intended for ''woodworking''. [[note]]Dovetail joints ''are'' very strong, and are also used in non-wood applications like attaching blades to rotors in jet engines, or joining sliding elements in metalworking presses and machine tools, but that strength is dependent on the material used; plastic is obviously a bad idea.[[/note]] Zack Nelson, testing the phone for his Website/YouTube channel [=JerryRigEverything=], was able to [[https://www.youtube.com/watch?v=tTIaUH6PIvo destroy]] [[https://www.youtube.com/watch?v=r3cWVdLqXCg both]] Nexus 6Ps he tested immediately.

to:

* The Samsung Galaxy Note 7, released in August 2016 and discontinued just two months later. On the surface, it was a great cell phone that competed with any number of comparable phablets. The problem? [[http://gizmodo.com/how-a-boring-iphone-7-put-samsung-on-the-path-to-self-d-1786786206 It was rushed to market]] to beat Apple's upcoming iPhone 7 (which came out the following month), and this left it with a serious flaw: namely, that it [[MadeOfExplodium had a habit of spontaneously combusting]]. Samsung hastily recalled the phone in September once it started causing dozens of fires (to the point where aviation safety authorities were telling people not to bring them onto planes), and gave buyers replacements with batteries from a different supplier. When ''those'' phones started catching fire as well, it became obvious that the problems had nothing to do with quality control and ran to the heart of the phone's design [[note]](the phone's power draw was greater than any battery that size could handle, leading its batteries to overvolt, overheat, and undergo the catastrophic chemical reaction that lithium-ion batteries are prone to)[[/note]]. By the time Samsung [[http://www.theverge.com/2016/10/11/13202608/samsung-galaxy-note-7-discontinued discontinued the Galaxy Note 7]], it had already become the [[EveryCarIsAPinto Ford Pinto]] of smartphones and a worldwide joke, with every major wireless carrier in the US having already pulled it from sale. Samsung especially [[BuryYourArt doesn't want to be reminded of it]], to the point that they ordered Platform/YouTube to take down any video showing a mod for ''VideoGame/GrandTheftAutoV'' that reskins the Sticky Bombs into the Galaxy Note 7.
* The Google Nexus 6P, made by Huawei, has multiple major design flaws that make accidental damage from sitting on it potentially catastrophic: the back isn't screwed into the structure of the phone even though it's not intended to be removable, leaving the thin aluminum on the sides and a slab of Gorilla Glass 4 not designed for structural integrity to hold the phone together; there's a gap between the battery and motherboard right by the power button that creates a weak point; and the plastic and metal are held together with dovetail joints, which are intended for ''woodworking''. [[note]]Dovetail joints ''are'' very strong, and are also used in non-wood applications like attaching blades to rotors in jet engines, or joining sliding elements in metalworking presses and machine tools, but that strength is dependent on the material used; plastic is obviously a bad idea.[[/note]] Zack Nelson, testing the phone for his [=YouTube=] channel [=JerryRigEverything=], was able to [[https://www.youtube.com/watch?v=tTIaUH6PIvo destroy]] [[https://www.youtube.com/watch?v=r3cWVdLqXCg both]] Nexus 6Ps he tested immediately.



* ''Website/{{Cracked}}'' has an article called "[[http://www.cracked.com/article_15764_5-least-surprising-toy-recalls-all-time.html The 5 Least Surprising Toy Recalls of All Time]]", listing variously [[MyLittlePanzer dangerous toys]]. Amongst them:

to:

* ''Cracked'' has an article called "[[http://www.cracked.com/article_15764_5-least-surprising-toy-recalls-all-time.html The 5 Least Surprising Toy Recalls of All Time]]", listing variously [[MyLittlePanzer dangerous toys]]. Amongst them:



* Wanna watch dubbed anime on Website/{{Crunchyroll}}? Well, good luck with that. Aside from dubs being ''very'' scarce on the site[[note]]It seems that most dubs available on the site are for the more popular shows, or legacy content that already had dubs produced beforehand[[/note]], it doesn't do a good job of telling you what has a dub and what doesn't. For example, ''Anime/DragonBallZ'' has ''only'' [[https://www.crunchyroll.com/series/G9VHN9PPW?utm_medium=android&utm_source=share the dub]] with no indication that this is the case, ''Manga/SlamDunk'''s dub [[http://www.crunchyroll.com/slam-dunk-dub is its own separate listing for some reason]], while most other shows have the dub hidden away as a separate season, which isn't readily apparent in the least. For comparison, other streaming services such as Netflix handle this the same way [=MKVs=] do: the viewer can swap between audio tracks on the fly through a menu and turn subtitles off, save for instances where they're necessary, such as song lyrics or non-English text within the anime itself.


to:

* Wanna watch dubbed anime on Platform/{{Crunchyroll}}? Well, good luck with that. Aside from dubs being ''very'' scarce on the site[[note]]It seems that most dubs available on the site are for the more popular shows, or legacy content that already had dubs produced beforehand[[/note]], it doesn't do a good job of telling you what has a dub and what doesn't. For example, ''Anime/DragonBallZ'' has ''only'' [[https://www.crunchyroll.com/series/G9VHN9PPW?utm_medium=android&utm_source=share the dub]] with no indication that this is the case, ''Manga/SlamDunk'''s dub [[http://www.crunchyroll.com/slam-dunk-dub is its own separate listing for some reason]], while most other shows have the dub hidden away as a separate season, which isn't readily apparent in the least. For comparison, other streaming services such as Netflix handle this the same way [=MKVs=] do: the viewer can swap between audio tracks on the fly through a menu and turn subtitles off, save for instances where they're necessary, such as song lyrics or non-English text within the anime itself.
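For the technically curious, the MKV comparison can be made concrete. Below is a minimal Python sketch of how a player enumerates the audio tracks inside an MKV container; it assumes FFmpeg's ffprobe tool is installed and on the PATH, and "example.mkv" is a placeholder file name, not any real release.

```python
import json
import subprocess

def list_audio_tracks(path):
    # Ask ffprobe (part of FFmpeg) to dump stream metadata as JSON.
    out = subprocess.run(
        ["ffprobe", "-v", "quiet", "-print_format", "json", "-show_streams", path],
        capture_output=True, text=True, check=True,
    ).stdout
    streams = json.loads(out).get("streams", [])
    # Keep only the audio streams, with their language tag and codec name.
    return [
        (s["index"], s.get("tags", {}).get("language", "und"), s.get("codec_name"))
        for s in streams
        if s.get("codec_type") == "audio"
    ]

# "example.mkv" is a hypothetical file used purely for illustration.
for index, language, codec in list_audio_tracks("example.mkv"):
    print(f"track {index}: language={language}, codec={codec}")
```

A player that reads this same metadata can offer every listed track in an on-the-fly audio menu, which is all the Netflix-style behavior described above amounts to.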

Reason: None


** Most of the 360's problems stem from the inexplicable decision to use a full-sized desktop DVD drive, which even in the larger original consoles took almost a quarter of their internal volume. Early models also had four rather large chips on the motherboard, due to the 90 nm manufacturing process, which also made them run quite hot (especially the [[UsefulNotes/GraphicsProcessingUnit GPU]]-[[UsefulNotes/VideoRAM VRAM]] combo that doubled as a northbridge). But the relative positions of the GPU and the drive (and the latter's bulk) meant that there simply wasn't any room to fit a practical heatsink. Microsoft tried to address this problem in two successive motherboard redesigns, the first of which finally added at least ''some'' heatsink, but it was only the third redesign, when the chipset was shrunk to just two components, that allowed designers to completely reshuffle the board and add a little fan atop the new, large heatsink, which finally alleviated the problem somewhat. However, even the Slim version still uses that hugeass desktop DVD drive, which still has ''no'' support for the disc, perpetuating the scratching problem.

to:

** Most of the 360's problems stem from the inexplicable decision to use a full-sized desktop DVD drive, which even in the larger original consoles took almost a quarter of their internal volume. Early models also had four rather large chips on the motherboard, due to the 90 nm manufacturing process, which also made them run quite hot (especially the [[MediaNotes/GraphicsProcessingUnit GPU]]-[[MediaNotes/VideoRAM VRAM]] combo that doubled as a northbridge). But the relative positions of the GPU and the drive (and the latter's bulk) meant that there simply wasn't any room to fit a practical heatsink. Microsoft tried to address this problem in two successive motherboard redesigns, the first of which finally added at least ''some'' heatsink, but it was only the third redesign, when the chipset was shrunk to just two components, that allowed designers to completely reshuffle the board and add a little fan atop the new, large heatsink, which finally alleviated the problem somewhat. However, even the Slim version still uses that hugeass desktop DVD drive, which still has ''no'' support for the disc, perpetuating the scratching problem.
Reason: None


* The Platform/NintendoDS only supported WEP Wi-Fi encryption, in an era when the much more secure WPA standard was rising. Wanted to play online with a WPA router? You'd either need to set your router's encryption to WEP, disable your router's password entirely (both present security vulnerabilities), or buy a special dongle from Nintendo that plugged into your computer and acted as a "middleman" between your DS and your router -- a dongle that was discontinued in 2007 due to a lawsuit brought against co-designer Buffalo by the Austraulian government.

to:

* The Platform/NintendoDS only supported WEP Wi-Fi encryption, in an era when the much more secure WPA standard was rising. Wanted to play online with a WPA router? You'd either need to set your router's encryption to WEP, disable your router's password entirely (both present security vulnerabilities), or buy a special dongle from Nintendo that plugged into your computer and acted as a "middleman" between your DS and your router -- a dongle that was discontinued in 2007 due to a lawsuit brought against co-designer Buffalo by the Australian government.
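As an aside on ''why'' WEP was such a liability: its 24-bit initialization vectors mean a busy network can end up reusing an RC4 keystream within a few thousand frames, and two frames encrypted with the same keystream leak the XOR of their plaintexts. The toy Python sketch below illustrates the principle only; random bytes stand in for RC4 output, and the messages are made up.

```python
import os

def xor(a, b):
    # XOR two byte strings (zip truncates to the shorter one).
    return bytes(x ^ y for x, y in zip(a, b))

# Stand-in for an RC4 keystream that gets reused after an IV collision.
keystream = os.urandom(32)

p1 = b"user=alice password=hunter2"   # made-up plaintext frames
p2 = b"user=bob   password=letmein"
c1, c2 = xor(p1, keystream), xor(p2, keystream)

# An eavesdropper who captures both frames learns p1 XOR p2 without ever
# touching the key; guessing either plaintext then reveals the other.
leaked = xor(c1, c2)
assert leaked == xor(p1, p2)
print(xor(leaked, p1))                # prints p2
```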
Reason: None


* The [=GameCube=] was able to play games in 480p resolution if the game supported it. However, the [=GameCube=] itself didn't output an analog 480p signal. It would only output 480p through a ''digital'' signal, which would then be converted back into an analog signal by the system's component cables, which contained a special DAC in the plug. Nintendo quietly discontinued production of the cables since less than 1% of consumers bought them, and the cables were too expensive to make without recouping costs. Because the 480p signal couldn't be achieved without Nintendo's component cables, it was, for a time, impossible for a consumer to simply use third-party component cables. Luckily, the component cables' internal DAC was eventually reverse-engineered and cheaper aftermarket cables began hitting the market. Not only that, it was discovered that the digital signal coming from the [=GameCube=]'s Digital AV port was compatible with the HDMI standard, and HDMI adapters began hitting the market as well; alternatively, you could just rip out the Digital AV port entirely and replace it with a regular HDMI port. In addition, Nintendo made the Wii (which plays [=GameCube=] discs) output the analog 480p signal directly from the console rather than processing a digital signal through the cables themselves (component cables are still needed, albeit much cheaper ones), thus making the Wii a cheaper option to play [=GameCube=] games in 480p compared to buying the [=GameCube=] cables secondhand at a premium price.

to:

* The [=GameCube=] was able to play games in 480p resolution if the game supported it. However, the [=GameCube=] itself didn't output an analog 480p signal. It would only output 480p through a ''digital'' signal, which would then be converted back into an analog signal by the system's component cables, which contained a special DAC in the plug. Nintendo quietly discontinued production of the cables since less than 1% of consumers bought them, and the cables were too expensive to make without recouping costs. Because the 480p signal couldn't be achieved without Nintendo's component cables, it was, for a time, impossible for a consumer to simply use third-party component cables. What's especially baffling is the presence of a separate digital port for component video in the first place, when the SNES illustrated that it was possible to output high-quality RGB video directly from the existing analog port. Rumor has it that the digital port was intended for the scrapped stereoscopic 3D add-on, and the cancellation of that add-on led to the port being repurposed as a component cable port. Luckily, the component cables' internal DAC was eventually reverse-engineered and cheaper aftermarket cables began hitting the market. Not only that, it was discovered that the digital signal coming from the [=GameCube=]'s Digital AV port was compatible with the HDMI standard, and HDMI adapters began hitting the market as well; alternatively, you could just rip out the Digital AV port entirely and replace it with a regular HDMI port. In addition, Nintendo made the Wii (which plays [=GameCube=] discs) output the analog 480p signal directly from the console rather than processing a digital signal through the cables themselves (component cables are still needed, albeit much cheaper ones), thus making the Wii a cheaper option to play [=GameCube=] games in 480p compared to buying the [=GameCube=] cables secondhand at a premium price.
Reason: None

Added DiffLines:

* The Platform/NintendoDS only supported WEP Wi-Fi encryption, in an era when the much more secure WPA standard was rising. Wanted to play online with a WPA router? You'd either need to set your router's encryption to WEP, disable your router's password entirely (both present security vulnerabilities), or buy a special dongle from Nintendo that plugged into your computer and acted as a "middleman" between your DS and your router -- a dongle that was discontinued in 2007 due to a lawsuit brought against co-designer Buffalo by the Austraulian government.
Reason: The width difference is intentional. Don't fix it.


[[caption-width-right:335:[[SelfDemonstratingArticle There's nothing about this that's good.]]]]

to:

[[caption-width-right:350:[[SelfDemonstratingArticle There's nothing about this that's good.]]]]
Reason: None


[[caption-width-right:350:[[SelfDemonstratingArticle There's nothing about this that's good.]]]]

to:

[[caption-width-right:335:[[SelfDemonstratingArticle There's nothing about this that's good.]]]]
Reason: None


** While the console ultimately proved a failure for several reasons, poor component choices helped contribute to its near-total lack of third-party support. It'd only be a slight exaggeration to say that the system's CPU was just three heavily overclocked Wii [=CPUs=] -- the Wii's own CPU in turn being just a higher-clocked version of the CPU that had been used a decade earlier in the Platform/NintendoGameCube -- slapped together on the same die, with performance that was abysmally poor by 2012 standards. Its GPU, while not ''as'' slow, wasn't all that much faster than those of the [=PS3=] and Xbox 360,[[note]]in fact, on paper the Wii U's GPU was actually ''slower'' than the [=GPUs=] of either of those consoles, but due to advances in technology was about twice as fast in practice... unless you offloaded code to the GPU to offset the lack of CPU grunt, which would bring it back down to, if not below, [=PS3/360=] levels[[/note]] and used a shader model in between those of the older consoles and their successors, meaning that ported [=PS3=]/360 games didn't take advantage of the newer hardware, while games designed for the [=PS4=] and Xbox One wouldn't even work to begin with due to the lack of necessary feature support. While Nintendo likely stuck with the [=PowerPC=] architecture for UsefulNotes/BackwardsCompatibility reasons, the system would likely have fared much better if Nintendo had just grabbed an off-the-shelf AMD laptop APU -- which had enough power even in 2012 to brute-force emulate the Wii, eliminating the main reason to stay with the [=PowerPC=] line -- stuffed it into a Wii case and called it a day. Fortunately, Nintendo seems to have learned from this, basing the Platform/NintendoSwitch on an existing [=nVidia=] mobile chip which thus far has proven surprisingly capable of punching above its weight.

to:

** While the console ultimately proved a failure for several reasons, poor component choices helped contribute to its near-total lack of third-party support. It'd only be a slight exaggeration to say that the system's CPU was just three heavily overclocked Wii [=CPUs=] -- the Wii's own CPU in turn being just a higher-clocked version of the CPU that had been used a decade earlier in the Platform/NintendoGameCube -- slapped together on the same die, with performance that was abysmally poor by 2012 standards. Its GPU, while not ''as'' slow, wasn't all that much faster than those of the [=PS3=] and Xbox 360,[[note]]in fact, on paper the Wii U's GPU was actually ''slower'' than the [=GPUs=] of either of those consoles, but due to advances in technology was about twice as fast in practice... unless you offloaded code to the GPU to offset the lack of CPU grunt, which would bring it back down to, if not below, [=PS3/360=] levels[[/note]] and used a shader model in between those of the older consoles and their successors, meaning that ported [=PS3=]/360 games didn't take advantage of the newer hardware, while games designed for the [=PS4=] and Xbox One wouldn't even work to begin with due to the lack of necessary feature support. While Nintendo likely stuck with the [=PowerPC=] architecture for MediaNotes/BackwardsCompatibility reasons, the system would likely have fared much better if Nintendo had just grabbed an off-the-shelf AMD laptop APU -- which had enough power even in 2012 to brute-force emulate the Wii, eliminating the main reason to stay with the [=PowerPC=] line -- stuffed it into a Wii case and called it a day. Fortunately, Nintendo seems to have learned from this, basing the Platform/NintendoSwitch on an existing [=nVidia=] mobile chip which thus far has proven surprisingly capable of punching above its weight.
Reason: None

Added DiffLines:

** Sega's most commercially successful console, the Platform/SegaGenesis, had its own share of mistakes:
*** The Genesis, unlike its chief rival the Platform/SuperNintendoEntertainmentSystem, was fully backwards compatible with the Platform/SegaMasterSystem. However, it wasn't ready to play Master System games out of the box. The cartridge slot couldn't accept Master System games directly; instead, you had to shell out extra cash for a separate Power Base Converter, basically a Master System cartridge adapter. Having to buy a separate attachment just to have access to your library of previous-gen games almost defeats the whole point of backwards compatibility in the first place. This alone was bad enough, but the way the converter was designed meant it wouldn't fit into the Genesis 2. Since the Master System was actually successful in Europe, that market got a special adapter remolded to be compatible with the Genesis 2. The American and Japanese markets weren't extended the same courtesy.
*** Bizarrely, the original incarnation of the Genesis, despite being able to output stereo sound, did not output it through the console's AV port, instead opting to output it through the console's front headphone jack. This meant having to buy a 3.5mm-to-stereo-RCA converter and run it from the front of your console to the back of your TV or stereo system if you wanted stereo sound -- not exactly an elegant solution. The Genesis 2 rectified this problem by ditching the headphone jack and outputting stereo right out of the AV port, [[https://www.youtube.com/watch?v=tiu8wDLV_9g at the expense of a lower-quality sound chip.]]
Reason:


* Depending on what perspective you take on this, modern desktop hardware is approaching this in terms of the performance you get for the power it consumes. For instance, [=CPUs=] from both Intel and AMD now hit the thermal limit first, allowing the CPU to draw much more power than previous designs ever did without deliberate mucking around in the firmware settings. Granted, for most scenarios power draw safety is mostly about averages rather than what's being pulled at that instant. However, it seems kind of silly that for most of these processors, the last 5-10% of performance requires at least another 20% more power. A cursory search on the internet for articles on power-limiting parts tends to show that people can reduce a part's power consumption by about 20-30% but only lose 5-10% performance, a drop that tends to be noticeable only if it's being measured (i.e., you're not going to notice it in practice). Essentially: desktop parts are being designed like high-end sports cars.
Reason: None


* Depending on what perspective you take on this, modern desktop hardware is approaching this in terms of the performance you get for the power it consumes. For instance, [=CPUs=] from both Intel and AMD now hit the thermal limit first, allowing the CPU to draw much more power than previous designs ever did without deliberate mucking around in the firmware settings. Granted, for most scenarios power draw safety is mostly about averages rather than what's being pulled at that instant. However, it seems kind of silly that for most of these processors, the last 5-10% of performance requires at least another 20% more power. A cursory search on the internet for articles on power-limiting parts tends to show that people can reduce a part's power consumption by about 20-30% but only lose 5-10% performance, a drop that tends to be noticeable only if it's being measured (i.e., you're not going to notice it in practice). Essentially: desktop power parts are being designed like high-end sports cars.

to:

* Depending on what perspective you take on this, modern desktop hardware is approaching this in terms of the performance you get for the power it consumes. For instance, [=CPUs=] from both Intel and AMD now hit the thermal limit first, allowing the CPU to draw much more power than previous designs ever did without deliberate mucking around in the firmware settings. Granted, for most scenarios power draw safety is mostly about averages rather than what's being pulled at that instant. However, it seems kind of silly that for most of these processors, the last 5-10% of performance requires at least another 20% more power. A cursory search on the internet for articles on power-limiting parts tends to show that people can reduce a part's power consumption by about 20-30% but only lose 5-10% performance, a drop that tends to be noticeable only if it's being measured (i.e., you're not going to notice it in practice). Essentially: desktop power parts are being designed like high-end sports cars.
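To put rough numbers on that sports-car analogy, here's a back-of-the-envelope Python sketch. The 30% power cut and 5% performance loss are the figures quoted above; the 200 W baseline is an arbitrary illustrative number, not a measurement of any particular CPU.

```python
# Efficiency gain from a modest power limit, using the figures quoted above.
baseline_power_w = 200.0    # arbitrary illustrative baseline, not a real CPU
baseline_perf = 1.00        # normalized performance

limited_power_w = baseline_power_w * (1 - 0.30)   # ~30% power cut
limited_perf = baseline_perf * (1 - 0.05)         # ~5% performance loss

gain = (limited_perf / limited_power_w) / (baseline_perf / baseline_power_w) - 1
print(f"performance per watt improves by about {gain:.0%}")  # -> about 36%
```

In other words, giving up the last few percent of performance buys a disproportionately large efficiency win, which is exactly why the stock tuning looks so silly.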
