
History MediaNotes / HowVideoGameSpecsWork


Is there an issue? Send a MessageReason:
None


** UsefulNotes/PhysicsEngine

to:

** MediaNotes/PhysicsEngine


* UsefulNotes/ProgrammingLanguage
** UsefulNotes/{{Python}}
** [[UsefulNotes/JavaScript [=JavaScript=]]]

to:

* MediaNotes/ProgrammingLanguage
** MediaNotes/{{Python}}
** [[MediaNotes/JavaScript [=JavaScript=]]]



** UsefulNotes/TheCLanguage
** UsefulNOtes/{{Ruby}}
** UsefulNotes/{{Rust}}
** UsefulNotes/{{Lua}}

to:

** MediaNotes/TheCLanguage
** MediaNotes/{{Ruby}}
** MediaNotes/{{Rust}}
** MediaNotes/{{Lua}}



* UsefulNotes/GamingAudio
** UsefulNotes/{{MIDI}}
** UsefulNotes/{{MOD}}

to:

* MediaNotes/GamingAudio
** Platform/{{MIDI}}
** Platform/{{MOD}}



** UsefulNotes/WavAudio

to:

** Platform/WavAudio


** UsefulNotes/{{Unity}}

to:

** MediaNotes/{{Unity}}


** UsefulNotes/{{Java}}

to:

** MediaNotes/{{Java}}


Or take processor speed. The Platform/SuperNintendo's 5A22 CPU had a UsefulNotes/ClockSpeed of 3.58 [=MHz=] (megahertz). The [[Platform/SegaGenesis Sega Genesis/Mega Drive]]'s 68000 CPU had a clock speed of 7.67 [=MHz=], just over twice the Super NES. So that meant that Super NES games should only run half as fast, right? Sega did have this advertising campaign of "[[http://segaretro.org/Blast_processing Blast Processing]]", and some early Super NES games did have slowdown. When Capcom ported ''VideoGame/StreetFighterII Turbo'' to the Super NES, it had a secret mode that was faster than the UsefulNotes/ArcadeGame and just as fast as the Genesis version, with little noticeable slowdown. The mode turned out to be more gimmick than playable, but it showed that the clock speed was only part of how the processor worked, and proper use of the system was what made the game run so fast. It did not necessarily mean the Super NES is just as fast as the Genesis, however, as evidenced by the faster scrolling speeds of Genesis games like ''Franchise/SonicTheHedgehog'' and the 3D polygon graphics of Genesis games like ''VideoGame/HardDrivin'' and ''VideoGame/StarCruiser'' not needing any UsefulNotes/{{Cartridge}} enhancement chips (which the Super NES required for 3D games like ''VideoGame/StarFox'').

to:

Or take processor speed. The Platform/SuperNintendo's 5A22 CPU had a UsefulNotes/ClockSpeed of 3.58 [=MHz=] (megahertz). The [[Platform/SegaGenesis Sega Genesis/Mega Drive]]'s 68000 CPU had a clock speed of 7.67 [=MHz=], just over twice the Super NES. So that meant that Super NES games should only run half as fast, right? Sega did have this advertising campaign of "[[http://segaretro.org/Blast_processing Blast Processing]]", and some early Super NES games did have slowdown. When Capcom ported ''VideoGame/StreetFighterII Turbo'' to the Super NES, it had a secret mode that was faster than the UsefulNotes/ArcadeGame and just as fast as the Genesis version, with little noticeable slowdown. The mode turned out to be more gimmick than playable, but it showed that the clock speed was only part of how the processor worked, and proper use of the system was what made the game run so fast. It did not necessarily mean the Super NES is just as fast as the Genesis, however, as evidenced by the faster scrolling speeds of Genesis games like ''Franchise/SonicTheHedgehog'' and the 3D polygon graphics of Genesis games like ''VideoGame/HardDrivin'' and ''VideoGame/StarCruiser'' not needing any Platform/{{Cartridge}} enhancement chips (which the Super NES required for 3D games like ''VideoGame/StarFox'').
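The "clock speed is only part of the story" point above can be made concrete with a rough back-of-the-envelope sketch. The cycle counts below are simplified ballpark figures for the simplest register operations (the 68000's fastest instructions take about 4 cycles; the 65C816-family 5A22 can execute simple operations in around 2), not precise timings, so treat this as an illustration, not a benchmark:

```python
# Rough sketch: why a 2x clock-speed gap does not mean a 2x speed gap.
# Cycle counts are simplified ballpark figures for the simplest
# instructions on each CPU, not exact hardware timings.

def instruction_throughput_mips(clock_mhz, cycles_per_instruction):
    """Approximate millions of simple instructions per second."""
    return clock_mhz / cycles_per_instruction

# Genesis/Mega Drive: 68000 at 7.67 MHz, ~4 cycles per simple instruction.
genesis = instruction_throughput_mips(7.67, 4)

# Super NES: 5A22 at 3.58 MHz, ~2 cycles per simple instruction.
snes = instruction_throughput_mips(3.58, 2)

print(f"Genesis ~{genesis:.2f} MIPS, Super NES ~{snes:.2f} MIPS")
# Despite the Genesis clock being just over twice as fast, the gap in
# simple-instruction throughput is far smaller than 2x.
```

Real performance depends on far more than this (memory wait states, instruction mix, DMA), but the sketch shows why raw megahertz comparisons between different CPU architectures mislead.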



** UsefulNotes/MagneticDisk
** UsefulNotes/OpticalDisc
*** UsefulNotes/CompactDisc

to:

** Platform/MagneticDisk
** Platform/OpticalDisc
*** Platform/CompactDisc



** UsefulNotes/{{Cartridge}}

to:

** Platform/{{Cartridge}}



** UsefulNotes/{{MP3}}

to:

** Platform/{{MP3}}


*** UsefulNotes/{{DVD}}
*** UsefulNotes/BluRay

to:

*** Platform/{{DVD}}
*** Platform/BluRay


* UsefulNotes/AnalogVsDigital

to:

* MediaNotes/AnalogVsDigital


* UsefulNotes/GameEngine

to:

* MediaNotes/GameEngine

Added DiffLines:

* UsefulNotes/{{Emulation}}


** UsefulNotes/UnrealEngine

to:

** MediaNotes/UnrealEngine
No need to link it twice


This marketing scheme began very early on, in fact. The earliest recorded example of this being tried out is with NEC and the Platform/TurboGrafx16, which is, as evident in its name, advertised as a 16-bit machine despite having a 8-bit CPU (the 16-bit part refers to its Graphics chipset, which was capable of more than 256 colors), or SNK claiming that the Platform/NeoGeo is a 24-bit machine (it's actually just 16-bit like the Platform/SegaGenesis- in fact, it's the exact same CPU. The 24-bit part refers to the color depth that its custom GPU chipset was capable of). Even the Platform/Nintendo64, for example, while indeed capable of 64-bit calculations, ran with a 32-bit capacity most of the time for a lot of good reasons. Some people have gone so far as to call the Platform/{{Dreamcast}}, Platform/PlayStation2, [[Platform/NintendoGameCube GameCube]] and Platform/{{Xbox}} generation "128-bit" thanks to this misinformation to this very day--out of these, only the Dreamcast and [=PS2=] were capable of 128-bit ''anything'' natively, and even then, it's not for general-purpose processing--and on top of that, the [=PS2's=] main processor is a 64-bit MIPS R5900,[[note]]Both the CPU and Vector Units have 128-bit Single-Instruction, Multiple Data (SIMD) instructions. However, these don't work on a single 128-bit value, but either 4 32-bit values, 8 16-bit values, or 16 8-bit values.[[/note]] although Sony deliberately mismarketed it as being twice as much that due to the "more bits equals better graphics" belief still lingering, and to compete with the equally misadvertised [[Platform/{{Dreamcast}} Sega Dreamcast]][[note]]Which had a 32 bit CPU, with a 128 bit vector unit, which Sega got marketing mileage out of by using this fact to advertise the whole system as being 128 bit[[/note]]. 
Even Nintendo got on the bandwagon, as the tech demo for the [=GameCube=] was called ''[[http://en.wikipedia.org/wiki/Super_Mario_128 Super Mario 128]]'', implying that the console has a 128-bit CPU.[[note]]Its [=PowerPC=]-based CPU has a 32-bit word size with a double-precision 64-bit FPU, which many took to mean that the CPU was 128-bit since 2x64=128, although the real reason the demo is named as such is because it spawns [[ExactlyWhatItSaysOnTheTin 128 individual 3D Marios on screen]]. In terms of bus width, the [=GameCube=]'s CPU has a 64-bit data bus, while its GPU has a 64-bit data bus and 512-bit texture cache.[[/note]] If anything, it was Microsoft who were honest by upfront admitting that the CPU in the Xbox is 32-bit just like in [=PCs=], and that more bits generally do not translate to better graphics and better gameplay, instead playing up on the Xbox's exclusive titles and ease of porting Windows games to the platform.[[note]]While some fanboys of competing platforms initially derided the Xbox for being "two gens behind", the exclusives quickly won a fan base over, proving that yes, it's the games that count. And of course, later, console companies stopped this marketing tactic anyway[[/note]]. Occasionally, the bit misconception still pops up at the start of the [=PS360=] era, such as some mistaking the Platform/Xbox360 as having a 360-bit CPU (it actually uses 64-bit hardware) or the Platform/PlayStation3 as having a 256-bit CPU (again, 64-bit hardware), this completely died down shortly after.

to:

This marketing scheme began very early on, in fact. The earliest recorded example of this being tried out is with NEC and the Platform/TurboGrafx16, which, as evident in its name, was advertised as a 16-bit machine despite having an 8-bit CPU (the 16-bit part refers to its graphics chipset, which was capable of more than 256 colors), or SNK claiming that the Platform/NeoGeo is a 24-bit machine (it's actually just 16-bit like the Platform/SegaGenesis -- in fact, it's the exact same CPU. The 24-bit part refers to the color depth that its custom GPU chipset was capable of). Even the Platform/Nintendo64, for example, while indeed capable of 64-bit calculations, ran with a 32-bit capacity most of the time for a lot of good reasons. Some people have gone so far as to call the Platform/{{Dreamcast}}, Platform/PlayStation2, [[Platform/NintendoGameCube GameCube]] and Platform/{{Xbox}} generation "128-bit" thanks to this misinformation to this very day--out of these, only the Dreamcast and [=PS2=] were capable of 128-bit ''anything'' natively, and even then, it's not for general-purpose processing--and on top of that, the [=PS2's=] main processor is a 64-bit MIPS R5900,[[note]]Both the CPU and Vector Units have 128-bit Single-Instruction, Multiple Data (SIMD) instructions. However, these don't work on a single 128-bit value, but on either 4 32-bit values, 8 16-bit values, or 16 8-bit values.[[/note]] although Sony deliberately mismarketed it as being twice that due to the "more bits equals better graphics" belief still lingering, and to compete with the equally misadvertised Dreamcast[[note]]Which had a 32-bit CPU with a 128-bit vector unit, a fact Sega got marketing mileage out of by advertising the whole system as 128-bit[[/note]]. 
Even Nintendo got on the bandwagon, as the tech demo for the [=GameCube=] was called ''[[http://en.wikipedia.org/wiki/Super_Mario_128 Super Mario 128]]'', implying that the console has a 128-bit CPU.[[note]]Its [=PowerPC=]-based CPU has a 32-bit word size with a double-precision 64-bit FPU, which many took to mean that the CPU was 128-bit since 2x64=128, although the real reason the demo is named as such is that it spawns [[ExactlyWhatItSaysOnTheTin 128 individual 3D Marios on screen]]. In terms of bus width, the [=GameCube=]'s CPU has a 64-bit data bus, while its GPU has a 64-bit data bus and 512-bit texture cache.[[/note]] If anything, it was Microsoft who were honest, admitting upfront that the CPU in the Xbox is 32-bit just like in [=PCs=], and that more bits generally do not translate to better graphics and better gameplay, instead playing up the Xbox's exclusive titles and ease of porting Windows games to the platform.[[note]]While some fanboys of competing platforms initially derided the Xbox for being "two gens behind", the exclusives quickly won a fan base over, proving that yes, it's the games that count. And of course, console companies later stopped this marketing tactic anyway.[[/note]] Occasionally, the bit misconception still popped up at the start of the [=PS360=] era, such as some mistaking the Platform/Xbox360 as having a 360-bit CPU (it actually uses 64-bit hardware) or the Platform/PlayStation3 as having a 256-bit CPU (again, 64-bit hardware), though this completely died down shortly after.
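The footnote's point about SIMD can be sketched in plain Python: a "128-bit" SIMD add does not treat its operand as one giant number, but as four independent 32-bit lanes that each wrap around on their own, with no carry crossing lane boundaries. This is an illustrative model, not actual PS2 or vector-unit code:

```python
# Illustrative sketch of 4x32-bit SIMD addition: four independent
# 32-bit lanes, each wrapping on overflow, with no carry between lanes.

MASK32 = 0xFFFFFFFF

def split_lanes(value128):
    """Split a 128-bit integer into four 32-bit lanes (low lane first)."""
    return [(value128 >> (32 * i)) & MASK32 for i in range(4)]

def join_lanes(lanes):
    """Pack four 32-bit lanes back into one 128-bit integer."""
    result = 0
    for i, lane in enumerate(lanes):
        result |= (lane & MASK32) << (32 * i)
    return result

def simd_add_4x32(a, b):
    """Lane-wise add: each lane wraps modulo 2**32 independently,
    unlike a true 128-bit addition where a carry would propagate."""
    return join_lanes([(x + y) & MASK32
                       for x, y in zip(split_lanes(a), split_lanes(b))])

# With the low lane at its 32-bit maximum, adding 1 wraps that lane to 0
# instead of carrying into the next lane:
a = 0xFFFFFFFF                    # low lane = 2**32 - 1, other lanes = 0
b = 0x00000001
print(hex(simd_add_4x32(a, b)))   # lane wraps to 0; no carry into lane 1
print(hex(a + b))                 # a true 128-bit add would carry
```

This is why "128-bit SIMD" says nothing about a console being a "128-bit machine" in the general-purpose sense.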


This marketing scheme began very early on, in fact. The earliest recorded example of this being tried out is with NEC and the UsefulNotes/TurboGrafx16, which is, as evident in its name, advertised as a 16-bit machine despite having a 8-bit CPU (the 16-bit part refers to its Graphics chipset, which was capable of more than 256 colors), or SNK claiming that the UsefulNotes/NeoGeo is a 24-bit machine (it's actually just 16-bit like the UsefulNotes/SegaGenesis- in fact, it's the exact same CPU. The 24-bit part refers to the color depth that its custom GPU chipset was capable of). Even the UsefulNotes/{{Nintendo 64}}, for example, while indeed capable of 64-bit calculations, ran with a 32-bit capacity most of the time for a lot of good reasons. Some people have gone so far as to call the [[UsefulNotes/SegaDreamcast Dreamcast]], UsefulNotes/PlayStation2, [[UsefulNotes/NintendoGameCube GameCube]] and UsefulNotes/{{Xbox}} generation "128-bit" thanks to this misinformation to this very day--out of these, only the Dreamcast and [=PS2=] were capable of 128-bit ''anything'' natively, and even then, it's not for general-purpose processing--and on top of that, the [=PS2's=] main processor is a 64-bit MIPS R5900,[[note]]Both the CPU and Vector Units have 128-bit Single-Instruction, Multiple Data (SIMD) instructions. However, these don't work on a single 128-bit value, but either 4 32-bit values, 8 16-bit values, or 16 8-bit values.[[/note]] although Sony deliberately mismarketed it as being twice as much that due to the "more bits equals better graphics" belief still lingering, and to compete with the equally misadvertised UsefulNotes/SegaDreamcast[[note]]Which had a 32 bit CPU, with a 128 bit vector unit, which Sega got marketing mileage out of by using this fact to advertise the whole system as being 128 bit[[/note]]. 
Even Nintendo got on the bandwagon, as the tech demo for the [=GameCube=] was called ''[[http://en.wikipedia.org/wiki/Super_Mario_128 Super Mario 128]]'', implying that the console has a 128-bit CPU.[[note]]Its [=PowerPC=]-based CPU has a 32-bit word size with a double-precision 64-bit FPU, which many took to mean that the CPU was 128-bit since 2x64=128, although the real reason the demo is named as such is because it spawns [[ExactlyWhatItSaysOnTheTin 128 individual 3D Marios on screen]]. In terms of bus width, the [=GameCube=]'s CPU has a 64-bit data bus, while its GPU has a 64-bit data bus and 512-bit texture cache.[[/note]] If anything, it was Microsoft who were honest by upfront admitting that the CPU in the Xbox is 32-bit just like in [=PCs=], and that more bits generally do not translate to better graphics and better gameplay, instead playing up on the Xbox's exclusive titles and ease of porting Windows games to the platform.[[note]]While some fanboys of competing platforms initially derided the Xbox for being "two gens behind", the exclusives quickly won a fan base over, proving that yes, it's the games that count. And of course, later, console companies stopped this marketing tactic anyway[[/note]]. Occasionally, the bit misconception still pops up at the start of the [=PS360=] era, such as some mistaking the UsefulNotes/{{Xbox 360}} as having a 360-bit CPU (it actually uses 64-bit hardware) or the UsefulNotes/PlayStation3 as having a 256-bit CPU (again, 64-bit hardware), this completely died down shortly after.

to:

This marketing scheme began very early on, in fact. The earliest recorded example of this being tried out is with NEC and the Platform/TurboGrafx16, which, as evident in its name, was advertised as a 16-bit machine despite having an 8-bit CPU (the 16-bit part refers to its graphics chipset, which was capable of more than 256 colors), or SNK claiming that the Platform/NeoGeo is a 24-bit machine (it's actually just 16-bit like the Platform/SegaGenesis -- in fact, it's the exact same CPU. The 24-bit part refers to the color depth that its custom GPU chipset was capable of). Even the Platform/Nintendo64, for example, while indeed capable of 64-bit calculations, ran with a 32-bit capacity most of the time for a lot of good reasons. Some people have gone so far as to call the Platform/{{Dreamcast}}, Platform/PlayStation2, [[Platform/NintendoGameCube GameCube]] and Platform/{{Xbox}} generation "128-bit" thanks to this misinformation to this very day--out of these, only the Dreamcast and [=PS2=] were capable of 128-bit ''anything'' natively, and even then, it's not for general-purpose processing--and on top of that, the [=PS2's=] main processor is a 64-bit MIPS R5900,[[note]]Both the CPU and Vector Units have 128-bit Single-Instruction, Multiple Data (SIMD) instructions. However, these don't work on a single 128-bit value, but on either 4 32-bit values, 8 16-bit values, or 16 8-bit values.[[/note]] although Sony deliberately mismarketed it as being twice that due to the "more bits equals better graphics" belief still lingering, and to compete with the equally misadvertised [[Platform/{{Dreamcast}} Sega Dreamcast]][[note]]Which had a 32-bit CPU with a 128-bit vector unit, a fact Sega got marketing mileage out of by advertising the whole system as 128-bit[[/note]]. 
Even Nintendo got on the bandwagon, as the tech demo for the [=GameCube=] was called ''[[http://en.wikipedia.org/wiki/Super_Mario_128 Super Mario 128]]'', implying that the console has a 128-bit CPU.[[note]]Its [=PowerPC=]-based CPU has a 32-bit word size with a double-precision 64-bit FPU, which many took to mean that the CPU was 128-bit since 2x64=128, although the real reason the demo is named as such is that it spawns [[ExactlyWhatItSaysOnTheTin 128 individual 3D Marios on screen]]. In terms of bus width, the [=GameCube=]'s CPU has a 64-bit data bus, while its GPU has a 64-bit data bus and 512-bit texture cache.[[/note]] If anything, it was Microsoft who were honest, admitting upfront that the CPU in the Xbox is 32-bit just like in [=PCs=], and that more bits generally do not translate to better graphics and better gameplay, instead playing up the Xbox's exclusive titles and ease of porting Windows games to the platform.[[note]]While some fanboys of competing platforms initially derided the Xbox for being "two gens behind", the exclusives quickly won a fan base over, proving that yes, it's the games that count. And of course, console companies later stopped this marketing tactic anyway.[[/note]] Occasionally, the bit misconception still popped up at the start of the [=PS360=] era, such as some mistaking the Platform/Xbox360 as having a 360-bit CPU (it actually uses 64-bit hardware) or the Platform/PlayStation3 as having a 256-bit CPU (again, 64-bit hardware), though this completely died down shortly after.



Or take processor speed. The UsefulNotes/SuperNintendo's 5A22 CPU had a UsefulNotes/ClockSpeed of 3.58 [=MHz=] (megahertz). The [[UsefulNotes/SegaGenesis Sega Genesis/Mega Drive]]'s 68000 CPU had a clock speed of 7.67 [=MHz=], just over twice the Super NES. So that meant that Super NES games should only run half as fast, right? Sega did have this advertising campaign of "[[http://segaretro.org/Blast_processing Blast Processing]]", and some early Super NES games did have slowdown. When Capcom ported ''VideoGame/StreetFighterII Turbo'' to the Super NES, it had a secret mode that was faster than the UsefulNotes/ArcadeGame and just as fast as the Genesis version, with little noticeable slowdown. The mode turned out to be more gimmick than playable, but it showed that the clock speed was only part of how the processor worked, and proper use of the system was what made the game run so fast. It did not necessarily mean the Super NES is just as fast as the Genesis, however, as evidenced by the faster scrolling speeds of Genesis games like ''Franchise/SonicTheHedgehog'' and the 3D polygon graphics of Genesis games like ''VideoGame/HardDrivin'' and ''VideoGame/StarCruiser'' not needing any UsefulNotes/{{Cartridge}} enhancement chips (which the Super NES required for 3D games like ''VideoGame/StarFox'').

to:

Or take processor speed. The Platform/SuperNintendo's 5A22 CPU had a UsefulNotes/ClockSpeed of 3.58 [=MHz=] (megahertz). The [[Platform/SegaGenesis Sega Genesis/Mega Drive]]'s 68000 CPU had a clock speed of 7.67 [=MHz=], just over twice the Super NES. So that meant that Super NES games should only run half as fast, right? Sega did have this advertising campaign of "[[http://segaretro.org/Blast_processing Blast Processing]]", and some early Super NES games did have slowdown. When Capcom ported ''VideoGame/StreetFighterII Turbo'' to the Super NES, it had a secret mode that was faster than the UsefulNotes/ArcadeGame and just as fast as the Genesis version, with little noticeable slowdown. The mode turned out to be more gimmick than playable, but it showed that the clock speed was only part of how the processor worked, and proper use of the system was what made the game run so fast. It did not necessarily mean the Super NES is just as fast as the Genesis, however, as evidenced by the faster scrolling speeds of Genesis games like ''Franchise/SonicTheHedgehog'' and the 3D polygon graphics of Genesis games like ''VideoGame/HardDrivin'' and ''VideoGame/StarCruiser'' not needing any UsefulNotes/{{Cartridge}} enhancement chips (which the Super NES required for 3D games like ''VideoGame/StarFox'').



So far, we've only discussed the UsefulNotes/CentralProcessingUnit (CPU). Another equally important factor (or nowadays, ''more'' important) is the UsefulNotes/GraphicsProcessingUnit (GPU). Several marketing campaigns often referred to the GPU, yet many misunderstood them, erroneously assuming they referred to the CPU. The UsefulNotes/TurboGrafx16, for example, was advertised as a "16-bit" system because its dual GPU chipset had a 16-bit data bus, but many erroneously assumed the "16-bit" was referring to its 8-bit CPU somehow being 16-bit. Likewise, the UsefulNotes/NeoGeo was advertised as a "24-bit" system because its GPU chipset had a 24-bit data bus, but many erroneously assumed it was referring to its CPU processors (one 16/32-bit, the other 8/16-bit). This misadvertising is by no means limited to consoles, as GPU manufacturers for [=PCs=] and Macs of the era also play up to the hype, often claiming that more bits equals more vibrant colors or more realistic graphics compared to their lesser competitors.

The most infamous example of such a misunderstanding is the "[[http://segaretro.org/Blast_processing Blast Processing]]" campaign of the UsefulNotes/SegaGenesis. Many erroneously assumed the term was referring to its CPU's faster clock speed, but it was in fact referring to its GPU's faster [[https://en.wikipedia.org/wiki/Direct_memory_access DMA]] (direct memory access). The Genesis GPU had a 13 [=MHz=] clock speed and 8 [=MHz=] bus speed, whereas the Super NES GPU had a 5.3 [=MHz=] clock speed and 3.5 [=MHz=] bus speed. This allowed the Genesis GPU to have a faster DMA write speed of 3.2–6.4 MB/s, compared to the Super Nintendo's 2.6 MB/s DMA write speed. This was what ''really'' gave the Genesis its faster performance, not just its CPU's higher clock speed.

to:

So far, we've only discussed the UsefulNotes/CentralProcessingUnit (CPU). Another equally important factor (or nowadays, ''more'' important) is the UsefulNotes/GraphicsProcessingUnit (GPU). Several marketing campaigns often referred to the GPU, yet many misunderstood them, erroneously assuming they referred to the CPU. The Platform/TurboGrafx16, for example, was advertised as a "16-bit" system because its dual GPU chipset had a 16-bit data bus, but many erroneously assumed the "16-bit" was referring to its 8-bit CPU somehow being 16-bit. Likewise, the Platform/NeoGeo was advertised as a "24-bit" system because its GPU chipset had a 24-bit data bus, but many erroneously assumed it was referring to its CPU processors (one 16/32-bit, the other 8/16-bit). This misadvertising is by no means limited to consoles, as GPU manufacturers for [=PCs=] and Macs of the era also play up to the hype, often claiming that more bits equals more vibrant colors or more realistic graphics compared to their lesser competitors.

The most infamous example of such a misunderstanding is the "[[http://segaretro.org/Blast_processing Blast Processing]]" campaign of the Platform/SegaGenesis. Many erroneously assumed the term was referring to its CPU's faster clock speed, but it was in fact referring to its GPU's faster [[https://en.wikipedia.org/wiki/Direct_memory_access DMA]] (direct memory access). The Genesis GPU had a 13 [=MHz=] clock speed and 8 [=MHz=] bus speed, whereas the Super NES GPU had a 5.3 [=MHz=] clock speed and 3.5 [=MHz=] bus speed. This allowed the Genesis GPU to have a faster DMA write speed of 3.2–6.4 MB/s, compared to the Super Nintendo's 2.6 MB/s DMA write speed. This was what ''really'' gave the Genesis its faster performance, not just its CPU's higher clock speed.
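The DMA figures above can be put in per-frame terms with some back-of-the-envelope arithmetic. This sketch uses the numbers from the paragraph, assumes a 60 fps frame rate and 1 MB = 1,000,000 bytes for simplicity, and ignores the fact that real console DMA is largely confined to blanking periods:

```python
# Back-of-the-envelope: how many bytes each console's DMA write speed
# could move per frame at 60 fps. Figures from the paragraph above;
# real DMA windows (mostly during blanking) would reduce these numbers.

FPS = 60

def bytes_per_frame(mb_per_second):
    """Bytes moved in one frame at a sustained rate, with 1 MB = 1e6 bytes."""
    return mb_per_second * 1_000_000 / FPS

snes_dma = bytes_per_frame(2.6)          # Super NES DMA write speed
genesis_dma_low = bytes_per_frame(3.2)   # Genesis DMA, low end
genesis_dma_high = bytes_per_frame(6.4)  # Genesis DMA, high end

print(f"Super NES: ~{snes_dma:,.0f} bytes/frame")
print(f"Genesis:   ~{genesis_dma_low:,.0f} to {genesis_dma_high:,.0f} bytes/frame")
```

More bytes per frame means more tiles and sprite data rewritten between frames, which is what fast scrolling games actually demand of the hardware.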



** UsefulNotes/MacOS
** UsefulNotes/MicrosoftWindows

to:

** Platform/MacOS
** Platform/MicrosoftWindows



** UsefulNotes/{{Symbian}}
** UsefulNotes/{{UNIX}}

to:

** Platform/{{Symbian}}
** Platform/{{UNIX}}
Null edit

Added DiffLines:

* UsefulNotes/BackwardsCompatibility



to:

* UsefulNotes/UniversalSerialBus

Added DiffLines:

** UsefulNotes/UnrealEngine
Adding Scumm VM

Added DiffLines:

** UsefulNotes/ScummVM
Fix a few wrong "it's"


This marketing scheme began very early on, in fact. The earliest recorded example of this being tried out is with NEC and the UsefulNotes/TurboGrafx16, which is, as evident in it's name, advertised as a 16-bit machine despite having a 8-bit CPU (the 16-bit part refers to it’s Graphics chipset, which was capable of more than 256 colors), or SNK claiming that the UsefulNotes/NeoGeo is a 24-bit machine (it's actually just 16-bit like the UsefulNotes/SegaGenesis- in fact, it's the exact same CPU. The 24-bit part refers to the color depth that it's custom GPU chipset was capable of). Even the UsefulNotes/{{Nintendo 64}}, for example, while indeed capable of 64-bit calculations, ran with a 32-bit capacity most of the time for a lot of good reasons. Some people have gone so far as to call the [[UsefulNotes/SegaDreamcast Dreamcast]], UsefulNotes/PlayStation2, [[UsefulNotes/NintendoGameCube GameCube]] and UsefulNotes/{{Xbox}} generation "128-bit" thanks to this misinformation to this very day--out of these, only the Dreamcast and [=PS2=] were capable of 128-bit ''anything'' natively, and even then, it's not for general-purpose processing--and on top of that, the [=PS2's=] main processor is a 64-bit MIPS R5900,[[note]]Both the CPU and Vector Units have 128-bit Single-Instruction, Multiple Data (SIMD) instructions. However, these don't work on a single 128-bit value, but either 4 32-bit values, 8 16-bit values, or 16 8-bit values.[[/note]] although Sony deliberately mismarketed it as being twice as much that due to the "more bits equals better graphics" belief still lingering, and to compete with the equally misadvertised UsefulNotes/SegaDreamcast[[note]]Which had a 32 bit CPU, with a 128 bit vector unit, which Sega got marketing mileage out of by using this fact to advertise the whole system as being 128 bit[[/note]]. 
Even Nintendo got on the bandwagon, as the tech demo for the [=GameCube=] was called ''[[http://en.wikipedia.org/wiki/Super_Mario_128 Super Mario 128]]'', implying that the console has a 128-bit CPU.[[note]]Its [=PowerPC=]-based CPU has a 32-bit word size with a double-precision 64-bit FPU, which many took to mean that the CPU was 128-bit since 2x64=128, although the real reason the demo is named as such is because it spawns [[ExactlyWhatItSaysOnTheTin 128 individual 3D Marios on screen]]. In terms of bus width, the [=GameCube=]'s CPU has a 64-bit data bus, while its GPU has a 64-bit data bus and 512-bit texture cache.[[/note]] If anything, it was Microsoft who were honest by upfront admitting that the CPU in the Xbox is 32-bit just like in [=PCs=], and that more bits generally do not translate to better graphics and better gameplay, instead playing up on the Xbox's exclusive titles and ease of porting Windows games to the platform.[[note]]While some fanboys of competing platforms initially derided the Xbox for being "two gens behind", the exclusives quickly won a fan base over, proving that yes, it's the games that count. And of course, later, console companies stopped this marketing tactic anyway[[/note]]. Occasionally, the bit misconception still pops up at the start of the [=PS360=] era, such as some mistaking the UsefulNotes/{{Xbox 360}} as having a 360-bit CPU (it actually uses 64-bit hardware) or the UsefulNotes/PlayStation3 as having a 256-bit CPU (again, 64-bit hardware), this completely died down shortly after.

to:

This marketing scheme began very early on, in fact. The earliest recorded example of this being tried out is with NEC and the UsefulNotes/TurboGrafx16, which is, as evident in its name, advertised as a 16-bit machine despite having a 8-bit CPU (the 16-bit part refers to its Graphics chipset, which was capable of more than 256 colors), or SNK claiming that the UsefulNotes/NeoGeo is a 24-bit machine (it's actually just 16-bit like the UsefulNotes/SegaGenesis- in fact, it's the exact same CPU. The 24-bit part refers to the color depth that its custom GPU chipset was capable of). Even the UsefulNotes/{{Nintendo 64}}, for example, while indeed capable of 64-bit calculations, ran with a 32-bit capacity most of the time for a lot of good reasons. Some people have gone so far as to call the [[UsefulNotes/SegaDreamcast Dreamcast]], UsefulNotes/PlayStation2, [[UsefulNotes/NintendoGameCube GameCube]] and UsefulNotes/{{Xbox}} generation "128-bit" thanks to this misinformation to this very day--out of these, only the Dreamcast and [=PS2=] were capable of 128-bit ''anything'' natively, and even then, it's not for general-purpose processing--and on top of that, the [=PS2's=] main processor is a 64-bit MIPS R5900,[[note]]Both the CPU and Vector Units have 128-bit Single-Instruction, Multiple Data (SIMD) instructions. However, these don't work on a single 128-bit value, but either 4 32-bit values, 8 16-bit values, or 16 8-bit values.[[/note]] although Sony deliberately mismarketed it as being twice as much that due to the "more bits equals better graphics" belief still lingering, and to compete with the equally misadvertised UsefulNotes/SegaDreamcast[[note]]Which had a 32 bit CPU, with a 128 bit vector unit, which Sega got marketing mileage out of by using this fact to advertise the whole system as being 128 bit[[/note]]. 
Even Nintendo got on the bandwagon, as the tech demo for the [=GameCube=] was called ''[[http://en.wikipedia.org/wiki/Super_Mario_128 Super Mario 128]]'', implying that the console has a 128-bit CPU.[[note]]Its [=PowerPC=]-based CPU has a 32-bit word size with a double-precision 64-bit FPU, which many took to mean that the CPU was 128-bit since 2x64=128, although the real reason the demo is named as such is because it spawns [[ExactlyWhatItSaysOnTheTin 128 individual 3D Marios on screen]]. In terms of bus width, the [=GameCube=]'s CPU has a 64-bit data bus, while its GPU has a 64-bit data bus and 512-bit texture cache.[[/note]] If anything, it was Microsoft who were honest by upfront admitting that the CPU in the Xbox is 32-bit just like in [=PCs=], and that more bits generally do not translate to better graphics and better gameplay, instead playing up on the Xbox's exclusive titles and ease of porting Windows games to the platform.[[note]]While some fanboys of competing platforms initially derided the Xbox for being "two gens behind", the exclusives quickly won a fan base over, proving that yes, it's the games that count. And of course, later, console companies stopped this marketing tactic anyway[[/note]]. Occasionally, the bit misconception still pops up at the start of the [=PS360=] era, such as some mistaking the UsefulNotes/{{Xbox 360}} as having a 360-bit CPU (it actually uses 64-bit hardware) or the UsefulNotes/PlayStation3 as having a 256-bit CPU (again, 64-bit hardware), this completely died down shortly after.
Is there an issue? Send a MessageReason:
None


This marketing scheme began very early on, in fact. The earliest recorded example of this being tried out is with NEC and the UsefulNotes/TurboGrafx16, which is, as evident in its name, advertised as a 16-bit machine despite having an 8-bit CPU (the 16-bit part refers to its graphics chipset, which was capable of more than 256 colors). Even the UsefulNotes/{{Nintendo 64}}, for example, while indeed capable of 64-bit calculations, ran with a 32-bit capacity most of the time for a lot of good reasons. Some people have gone so far as to call the [[UsefulNotes/SegaDreamcast Dreamcast]], UsefulNotes/PlayStation2, [[UsefulNotes/NintendoGameCube GameCube]] and UsefulNotes/{{Xbox}} generation "128-bit" thanks to this misinformation to this very day--out of these, only the Dreamcast and [=PS2=] were capable of 128-bit ''anything'' natively, and even then, it's not for general-purpose processing--and on top of that, the [=PS2's=] main processor is a 64-bit MIPS R5900,[[note]]Both the CPU and Vector Units have 128-bit Single-Instruction, Multiple Data (SIMD) instructions. However, these don't work on a single 128-bit value, but either 4 32-bit values, 8 16-bit values, or 16 8-bit values.[[/note]] although Sony deliberately mismarketed it as being twice that due to the "more bits equals better graphics" belief still lingering, and to compete with the equally misadvertised UsefulNotes/SegaDreamcast[[note]]Which had a 32 bit CPU, with a 128 bit vector unit, which Sega got marketing mileage out of by using this fact to advertise the whole system as being 128 bit[[/note]].
Even Nintendo got on the bandwagon, as the tech demo for the [=GameCube=] was called ''[[http://en.wikipedia.org/wiki/Super_Mario_128 Super Mario 128]]'', implying that the console has a 128-bit CPU.[[note]]Its [=PowerPC=]-based CPU has a 32-bit word size with a double-precision 64-bit FPU, which many took to mean that the CPU was 128-bit since 2x64=128, although the real reason the demo is named as such is that it spawns [[ExactlyWhatItSaysOnTheTin 128 individual 3D Marios on screen]]. In terms of bus width, the [=GameCube=]'s CPU has a 64-bit data bus, while its GPU has a 64-bit data bus and 512-bit texture cache.[[/note]] If anything, it was Microsoft who were honest by upfront admitting that the CPU in the Xbox is 32 bit just like in [=PCs=], and that more bits generally do not translate to better graphics and better gameplay, instead playing up on the Xbox's exclusive titles and ease of porting Windows games to the platform.[[note]]While some fanboys of competing platforms initially derided the Xbox for being "two gens behind", the exclusives quickly won a fan base over, proving that yes, it's the games that count. And of course, later, game companies stopped this marketing tactic anyway[[/note]]. Occasionally, the bit misconception still pops up today, such as some mistaking the UsefulNotes/{{Xbox 360}} as having a 360-bit CPU (it actually uses 64-bit hardware).

to:

This marketing scheme began very early on, in fact. The earliest recorded example of this being tried out is with NEC and the UsefulNotes/TurboGrafx16, which is, as evident in its name, advertised as a 16-bit machine despite having an 8-bit CPU (the 16-bit part refers to its graphics chipset, which was capable of more than 256 colors). Even the UsefulNotes/{{Nintendo 64}}, for example, while indeed capable of 64-bit calculations, ran with a 32-bit capacity most of the time for a lot of good reasons. Some people have gone so far as to call the [[UsefulNotes/SegaDreamcast Dreamcast]], UsefulNotes/PlayStation2, [[UsefulNotes/NintendoGameCube GameCube]] and UsefulNotes/{{Xbox}} generation "128-bit" thanks to this misinformation to this very day--out of these, only the Dreamcast and [=PS2=] were capable of 128-bit ''anything'' natively, and even then, it's not for general-purpose processing--and on top of that, the [=PS2's=] main processor is a 64-bit MIPS R5900,[[note]]Both the CPU and Vector Units have 128-bit Single-Instruction, Multiple Data (SIMD) instructions. However, these don't work on a single 128-bit value, but either 4 32-bit values, 8 16-bit values, or 16 8-bit values.[[/note]] although Sony deliberately mismarketed it as being twice that due to the "more bits equals better graphics" belief still lingering, and to compete with the equally misadvertised UsefulNotes/SegaDreamcast[[note]]Which had a 32 bit CPU, with a 128 bit vector unit, which Sega got marketing mileage out of by using this fact to advertise the whole system as being 128 bit[[/note]].
Even Nintendo got on the bandwagon, as the tech demo for the [=GameCube=] was called ''[[http://en.wikipedia.org/wiki/Super_Mario_128 Super Mario 128]]'', implying that the console has a 128-bit CPU.[[note]]Its [=PowerPC=]-based CPU has a 32-bit word size with a double-precision 64-bit FPU, which many took to mean that the CPU was 128-bit since 2x64=128, although the real reason the demo is named as such is that it spawns [[ExactlyWhatItSaysOnTheTin 128 individual 3D Marios on screen]]. In terms of bus width, the [=GameCube=]'s CPU has a 64-bit data bus, while its GPU has a 64-bit data bus and 512-bit texture cache.[[/note]] If anything, it was Microsoft who were honest by upfront admitting that the CPU in the Xbox is 32 bit 32-bit just like in [=PCs=], and that more bits generally do not translate to better graphics and better gameplay, instead playing up on the Xbox's exclusive titles and ease of porting Windows games to the platform.[[note]]While some fanboys of competing platforms initially derided the Xbox for being "two gens behind", the exclusives quickly won a fan base over, proving that yes, it's the games that count. And of course, later, game companies stopped this marketing tactic anyway[[/note]]. Occasionally, the bit misconception still pops up today, at the start of the [=PS360=] era, such as some mistaking the UsefulNotes/{{Xbox 360}} as having a 360-bit CPU (it actually uses 64-bit hardware).
hardware) or the PS3 as having a 256-bit CPU (again, 64-bit hardware); this completely died down shortly after.
Is there an issue? Send a MessageReason:
None


This marketing scheme began very early on, in fact. The earliest recorded example of this being tried out is with NEC and the UsefulNotes/TurboGrafx16, which is, as evident in its name, advertised as a 16-bit machine despite having an 8-bit CPU (the 16-bit part refers to its graphics chipset, which was capable of more than 256 colors). Even the UsefulNotes/{{Nintendo 64}}, for example, while indeed capable of 64-bit calculations, ran with a 32-bit capacity most of the time for a lot of good reasons. Some people have gone so far as to call the [[UsefulNotes/SegaDreamcast Dreamcast]], UsefulNotes/PlayStation2, [[UsefulNotes/NintendoGameCube GameCube]] and UsefulNotes/{{Xbox}} generation "128-bit" thanks to this misinformation to this very day--out of these, only the Dreamcast and [=PS2=] were capable of 128-bit ''anything'' natively, and even then, it's not for general-purpose processing--and on top of that, the [=PS2's=] main processor is a 64-bit MIPS R5900,[[note]]Both the CPU and Vector Units have 128-bit Single-Instruction, Multiple Data (SIMD) instructions. However, these don't work on a single 128-bit value, but either 4 32-bit values, 8 16-bit values, or 16 8-bit values.[[/note]] although Sony deliberately mismarketed it as being twice that due to the "more bits equals better graphics" belief still lingering, and to compete with the equally misadvertised UsefulNotes/SegaDreamcast[[note]]Which had a 32 bit CPU, with a 128 bit vector unit, which Sega got marketing mileage out of by using this fact to advertise the whole system as being 128 bit[[/note]].
Even Nintendo got on the bandwagon, as the tech demo for the [=GameCube=] was called ''[[http://en.wikipedia.org/wiki/Super_Mario_128 Super Mario 128]]'', implying that the console has a 128-bit CPU.[[note]]Its [=PowerPC=]-based CPU has a 32-bit word size with a double-precision 64-bit FPU, which many took to mean that the CPU was 128-bit since 2x64=128, although the real reason the demo is named as such is that it spawns [[ExactlyWhatItSaysOnTheTin 128 individual 3D Marios on screen]]. In terms of bus width, the [=GameCube=]'s CPU has a 64-bit data bus, while its GPU has a 64-bit data bus and 512-bit texture cache.[[/note]] Occasionally, the bit misconception still pops up today, such as some mistaking the UsefulNotes/{{Xbox 360}} as having a 360-bit CPU (it actually uses 64-bit hardware).

to:

This marketing scheme began very early on, in fact. The earliest recorded example of this being tried out is with NEC and the UsefulNotes/TurboGrafx16, which is, as evident in its name, advertised as a 16-bit machine despite having an 8-bit CPU (the 16-bit part refers to its graphics chipset, which was capable of more than 256 colors). Even the UsefulNotes/{{Nintendo 64}}, for example, while indeed capable of 64-bit calculations, ran with a 32-bit capacity most of the time for a lot of good reasons. Some people have gone so far as to call the [[UsefulNotes/SegaDreamcast Dreamcast]], UsefulNotes/PlayStation2, [[UsefulNotes/NintendoGameCube GameCube]] and UsefulNotes/{{Xbox}} generation "128-bit" thanks to this misinformation to this very day--out of these, only the Dreamcast and [=PS2=] were capable of 128-bit ''anything'' natively, and even then, it's not for general-purpose processing--and on top of that, the [=PS2's=] main processor is a 64-bit MIPS R5900,[[note]]Both the CPU and Vector Units have 128-bit Single-Instruction, Multiple Data (SIMD) instructions. However, these don't work on a single 128-bit value, but either 4 32-bit values, 8 16-bit values, or 16 8-bit values.[[/note]] although Sony deliberately mismarketed it as being twice that due to the "more bits equals better graphics" belief still lingering, and to compete with the equally misadvertised UsefulNotes/SegaDreamcast[[note]]Which had a 32 bit CPU, with a 128 bit vector unit, which Sega got marketing mileage out of by using this fact to advertise the whole system as being 128 bit[[/note]].
Even Nintendo got on the bandwagon, as the tech demo for the [=GameCube=] was called ''[[http://en.wikipedia.org/wiki/Super_Mario_128 Super Mario 128]]'', implying that the console has a 128-bit CPU.[[note]]Its [=PowerPC=]-based CPU has a 32-bit word size with a double-precision 64-bit FPU, which many took to mean that the CPU was 128-bit since 2x64=128, although the real reason the demo is named as such is that it spawns [[ExactlyWhatItSaysOnTheTin 128 individual 3D Marios on screen]]. In terms of bus width, the [=GameCube=]'s CPU has a 64-bit data bus, while its GPU has a 64-bit data bus and 512-bit texture cache.[[/note]] If anything, it was Microsoft who were honest by upfront admitting that the CPU in the Xbox is 32 bit just like in [=PCs=], and that more bits generally do not translate to better graphics and better gameplay, instead playing up on the Xbox's exclusive titles and ease of porting Windows games to the platform.[[note]]While some fanboys of competing platforms initially derided the Xbox for being "two gens behind", the exclusives quickly won a fan base over, proving that yes, it's the games that count. And of course, later, game companies stopped this marketing tactic anyway[[/note]]. Occasionally, the bit misconception still pops up today, such as some mistaking the UsefulNotes/{{Xbox 360}} as having a 360-bit CPU (it actually uses 64-bit hardware).
Is there an issue? Send a MessageReason:
None

Added DiffLines:

* UsefulNotes/PerformanceBottleneck
Is there an issue? Send a MessageReason:
None


** UsefulNotes/GameMaker

to:

** UsefulNotes/GameMakerGameMaker
Is there an issue? Send a MessageReason:
None

Added DiffLines:

** UsefulNotes/GameMaker

Removed: 38

Is there an issue? Send a MessageReason:
None


** UsefulNotes/ASideNoteAboutBranching

Added: 38

Changed: 37

Is there an issue? Send a MessageReason:
On second thought, this goes here.


* UsefulNotes/ASideNoteAboutBranching

to:

* UsefulNotes/ASideNoteAboutBranching


Added DiffLines:

** UsefulNotes/ASideNoteAboutBranching
