History MediaNotes / HowVideoGameSpecsWork


Changed: 37




to:

* UsefulNotes/ASideNoteAboutBranching

Added DiffLines:

*** UsefulNotes/GraphicsAPI
Moved per TRS.


* GameEngine

to:

* UsefulNotes/GameEngine

Added DiffLines:

** UsefulNotes/{{Symbian}}

Added: 23

Changed: 10



** UsefulNotes/{{Java}}



** UsefulNotes/{{Java}}

to:

** UsefulNotes/{{Ruby}}

Added: 34

Removed: 35



** UsefulNotes/JavaSoftwarePlatform


Added DiffLines:

* UsefulNotes/JavaSoftwarePlatform


[[DescribeTopicHere Describe]] How Video Game Specs Work [[DescribeTopicHere here]], but in ''layman's terms'', please (like the ''Guinness World Records'', we're just here to head off hunting bets, not prep you for IT 101). This isn't Wiki/TheOtherWiki.

Added: 45

Changed: 32

Removed: 23



** UsefulNotes/ScriptingLanguage
*** UsefulNotes/{{Lua}}

to:

** UsefulNotes/ScriptingLanguage
*** UsefulNotes/{{Lua}}
*** [[UsefulNotes/JavaScript [=JavaScript=]]]


Added DiffLines:

** UsefulNotes/{{Rust}}
** UsefulNotes/{{Lua}}


** UsefulNotes/Java

to:

** UsefulNotes/{{Java}}

Added DiffLines:

** UsefulNotes/Java

Added DiffLines:

** UsefulNotes/{{Unity}}


This marketing scheme began very early on, in fact. The earliest recorded example comes from NEC and the UsefulNotes/TurboGrafx16, which, as its name suggests, was advertised as a 16-bit machine despite having an 8-bit CPU (the 16-bit part refers to its graphics chipset, which was capable of more than 256 colors). Even the UsefulNotes/{{Nintendo 64}}, while indeed capable of 64-bit calculations, ran with a 32-bit capacity most of the time for a lot of good reasons. To this very day, some people go so far as to call the [[UsefulNotes/SegaDreamcast Dreamcast]], UsefulNotes/PlayStation2, [[UsefulNotes/NintendoGameCube GameCube]] and UsefulNotes/{{Xbox}} generation "128-bit" thanks to this misinformation--out of these, only the Dreamcast and [=PS2=] were capable of 128-bit ''anything'' natively, and even then, not for general-purpose processing--and on top of that, the [=PS2's=] main processor is a 64-bit MIPS R5900,[[note]]Both the CPU and Vector Units have 128-bit Single-Instruction, Multiple Data (SIMD) instructions. However, these don't work on a single 128-bit value, but on either 4 32-bit values, 8 16-bit values, or 16 8-bit values.[[/note]] although Sony deliberately mismarketed it as twice that, both because the "more bits equals better graphics" belief still lingered and to compete with the equally misadvertised UsefulNotes/SegaDreamcast[[note]]Which had a 32-bit CPU with a 128-bit vector unit, a fact Sega got marketing mileage out of by advertising the whole system as 128-bit[[/note]].
Nintendo also got on the bandwagon, as the tech demo for the [=GameCube=] was called ''[[http://en.wikipedia.org/wiki/Super_Mario_128 Super Mario 128]]'', implying that the console had a 128-bit CPU.[[note]]Its [=PowerPC=]-based CPU has a 32-bit word size with a double-precision 64-bit FPU, which many took to mean the CPU was 128-bit since 2x64=128; the demo is actually named for the [[ExactlyWhatItSaysOnTheTin 128 individual 3D Marios it spawns on screen]]. In terms of bus width, the [=GameCube=]'s CPU has a 64-bit data bus, while its GPU has a 64-bit data bus and a 512-bit texture cache.[[/note]] Occasionally, the bit misconception still pops up today, such as some mistaking the UsefulNotes/{{Xbox 360}} as having a 360-bit CPU (it actually uses 64-bit hardware).

to:

This marketing scheme began very early on, in fact. The earliest recorded example comes from NEC and the UsefulNotes/TurboGrafx16, which, as its name suggests, was advertised as a 16-bit machine despite having an 8-bit CPU (the 16-bit part refers to its graphics chipset, which was capable of more than 256 colors). Even the UsefulNotes/{{Nintendo 64}}, while indeed capable of 64-bit calculations, ran with a 32-bit capacity most of the time for a lot of good reasons. To this very day, some people go so far as to call the [[UsefulNotes/SegaDreamcast Dreamcast]], UsefulNotes/PlayStation2, [[UsefulNotes/NintendoGameCube GameCube]] and UsefulNotes/{{Xbox}} generation "128-bit" thanks to this misinformation--out of these, only the Dreamcast and [=PS2=] were capable of 128-bit ''anything'' natively, and even then, not for general-purpose processing--and on top of that, the [=PS2's=] main processor is a 64-bit MIPS R5900,[[note]]Both the CPU and Vector Units have 128-bit Single-Instruction, Multiple Data (SIMD) instructions. However, these don't work on a single 128-bit value, but on either 4 32-bit values, 8 16-bit values, or 16 8-bit values.[[/note]] although Sony deliberately mismarketed it as twice that, both because the "more bits equals better graphics" belief still lingered and to compete with the equally misadvertised UsefulNotes/SegaDreamcast[[note]]Which had a 32-bit CPU with a 128-bit vector unit, a fact Sega got marketing mileage out of by advertising the whole system as 128-bit[[/note]].
Even Nintendo got on the bandwagon, as the tech demo for the [=GameCube=] was called ''[[http://en.wikipedia.org/wiki/Super_Mario_128 Super Mario 128]]'', implying that the console had a 128-bit CPU.[[note]]Its [=PowerPC=]-based CPU has a 32-bit word size with a double-precision 64-bit FPU, which many took to mean the CPU was 128-bit since 2x64=128; the demo is actually named for the [[ExactlyWhatItSaysOnTheTin 128 individual 3D Marios it spawns on screen]]. In terms of bus width, the [=GameCube=]'s CPU has a 64-bit data bus, while its GPU has a 64-bit data bus and a 512-bit texture cache.[[/note]] Occasionally, the bit misconception still pops up today, such as some mistaking the UsefulNotes/{{Xbox 360}} as having a 360-bit CPU (it actually uses 64-bit hardware).
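The footnote's distinction--128-bit SIMD instructions that operate on four independent 32-bit lanes, not on one 128-bit number--can be sketched in plain Python. This is an illustrative model only, not the R5900's actual instruction set:

```python
# Sketch: a "128-bit" SIMD add is really four independent 32-bit adds.
# Overflow in one lane wraps around within that lane; no carry crosses
# into the neighboring lane, which is why this is not the same thing
# as a single 128-bit addition.

LANES = 4            # 4 x 32 bits = 128 bits
MASK32 = 0xFFFFFFFF  # keeps each lane to 32 bits

def simd_add_128(a_lanes, b_lanes):
    """Add two '128-bit registers' lane by lane as 4 packed 32-bit values."""
    assert len(a_lanes) == len(b_lanes) == LANES
    return [(a + b) & MASK32 for a, b in zip(a_lanes, b_lanes)]

result = simd_add_128([0xFFFFFFFF, 1, 2, 3], [1, 1, 1, 1])
print(result)  # [0, 2, 3, 4] -- lane 0 wrapped to 0 instead of carrying
```

A true 128-bit adder would have carried lane 0's overflow into lane 1; the packed version simply wraps, which is exactly what makes it useful for processing many small values (pixels, vertices) at once but useless for one giant number.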
just added to Flame Bait


[[DescribeTopicHere Describe]] How Video Game Specs Work [[DescribeTopicHere here]], but in ''layman's terms'', please (like the ''Guinness World Records'', we're just here to head off [[InternetBackdraft hunting bets]], not prep you for IT 101). This isn't Wiki/TheOtherWiki.

to:

[[DescribeTopicHere Describe]] How Video Game Specs Work [[DescribeTopicHere here]], but in ''layman's terms'', please (like the ''Guinness World Records'', we're just here to head off hunting bets, not prep you for IT 101). This isn't Wiki/TheOtherWiki.

** PixelVsTexel

to:

** UsefulNotes/PixelVsTexel


So far, we've only discussed the UsefulNotes/CentralProcessingUnit (CPU). Another equally important factor (or nowadays, ''more'' important) is the UsefulNotes/GraphicsProcessingUnit (GPU). Several marketing campaigns actually referred to the GPU, yet many misunderstood them and assumed they referred to the CPU. The UsefulNotes/TurboGrafx16, for example, was advertised as a "16-bit" system because its dual GPU chipset had a 16-bit data bus, but many erroneously took the "16-bit" to mean its 8-bit CPU was somehow 16-bit. Likewise, the UsefulNotes/NeoGeo was advertised as a "24-bit" system because its GPU chipset had a 24-bit data bus, but many erroneously assumed it referred to its two CPUs (one 16/32-bit, the other 8/16-bit). This misadvertising is by no means limited to consoles, as video card manufacturers for [=PCs=] and Macs of the era also played up the hype, often claiming that more bits equals more vibrant colors or more realistic graphics compared to their lesser competitors.

to:

So far, we've only discussed the UsefulNotes/CentralProcessingUnit (CPU). Another equally important factor (or nowadays, ''more'' important) is the UsefulNotes/GraphicsProcessingUnit (GPU). Several marketing campaigns actually referred to the GPU, yet many misunderstood them and assumed they referred to the CPU. The UsefulNotes/TurboGrafx16, for example, was advertised as a "16-bit" system because its dual GPU chipset had a 16-bit data bus, but many erroneously took the "16-bit" to mean its 8-bit CPU was somehow 16-bit. Likewise, the UsefulNotes/NeoGeo was advertised as a "24-bit" system because its GPU chipset had a 24-bit data bus, but many erroneously assumed it referred to its two CPUs (one 16/32-bit, the other 8/16-bit). This misadvertising is by no means limited to consoles, as GPU manufacturers for [=PCs=] and Macs of the era also played up the hype, often claiming that more bits equals more vibrant colors or more realistic graphics compared to their lesser competitors.


So far, we've only discussed the UsefulNotes/CentralProcessingUnit (CPU). Another equally important factor (or nowadays, ''more'' important) is the UsefulNotes/GraphicsProcessingUnit (GPU). Several marketing campaigns actually referred to the GPU, yet many misunderstood them and assumed they referred to the CPU. The UsefulNotes/TurboGrafx16, for example, was advertised as a "16-bit" system because its dual GPU chipset had a 16-bit data bus, but many erroneously took the "16-bit" to mean its 8-bit CPU was somehow 16-bit. Likewise, the UsefulNotes/NeoGeo was advertised as a "24-bit" system because its GPU chipset had a 24-bit data bus, but many erroneously assumed it referred to its two CPUs (one 16/32-bit, the other 8/16-bit). This misadvertising is by no means limited to consoles, as video card manufacturers for [=PCs=] and Macs also played up the hype, often claiming that more bits equals more vibrant colors or more realistic graphics compared to their competitors.

to:

So far, we've only discussed the UsefulNotes/CentralProcessingUnit (CPU). Another equally important factor (or nowadays, ''more'' important) is the UsefulNotes/GraphicsProcessingUnit (GPU). Several marketing campaigns actually referred to the GPU, yet many misunderstood them and assumed they referred to the CPU. The UsefulNotes/TurboGrafx16, for example, was advertised as a "16-bit" system because its dual GPU chipset had a 16-bit data bus, but many erroneously took the "16-bit" to mean its 8-bit CPU was somehow 16-bit. Likewise, the UsefulNotes/NeoGeo was advertised as a "24-bit" system because its GPU chipset had a 24-bit data bus, but many erroneously assumed it referred to its two CPUs (one 16/32-bit, the other 8/16-bit). This misadvertising is by no means limited to consoles, as video card manufacturers for [=PCs=] and Macs of the era also played up the hype, often claiming that more bits equals more vibrant colors or more realistic graphics compared to their lesser competitors.


This marketing scheme began very early on, in fact. The earliest recorded example comes from NEC and the UsefulNotes/TurboGrafx16, which, as its name suggests, was advertised as a 16-bit machine despite having an 8-bit CPU (the 16-bit part refers to its graphics chipset, which was capable of more than 256 colors). Even the UsefulNotes/{{Nintendo 64}}, while indeed capable of 64-bit calculations, ran with a 32-bit capacity most of the time for a lot of good reasons. To this very day, some people go so far as to call the [[UsefulNotes/SegaDreamcast Dreamcast]], UsefulNotes/PlayStation2, [[UsefulNotes/NintendoGameCube GameCube]] and UsefulNotes/{{Xbox}} generation "128-bit" thanks to this misinformation--out of these, only the Dreamcast and [=PS2=] were capable of 128-bit ''anything'' natively, and even then, not for general-purpose processing--and on top of that, the [=PS2's=] main processor is a 64-bit MIPS R5900,[[note]]Both the CPU and Vector Units have 128-bit Single-Instruction, Multiple Data (SIMD) instructions. However, these don't work on a single 128-bit value, but on either 4 32-bit values, 8 16-bit values, or 16 8-bit values.[[/note]] although Sony deliberately mismarketed it as twice that, both because the "more bits equals better graphics" belief still lingered and to compete with the equally misadvertised UsefulNotes/SegaDreamcast[[note]]Which had a 32-bit CPU with a 128-bit vector unit, a fact Sega got marketing mileage out of by advertising the whole system as 128-bit[[/note]].
Nintendo also got on the bandwagon, as the tech demo for the [=GameCube=] was called ''[[http://en.wikipedia.org/wiki/Super_Mario_128 Super Mario 128]]'', implying that the console had a 128-bit CPU.[[note]]Its [=PowerPC=]-based CPU has a 32-bit word size with a double-precision 64-bit FPU, which many took to mean the CPU was 128-bit since 2x64=128; the demo is actually named for the [[ExactlyWhatItSaysOnTheTin 128 individual 3D Marios it spawns on screen]]. In terms of bus width, the [=GameCube=]'s CPU has a 64-bit data bus, while its GPU has a 64-bit data bus and a 512-bit texture cache.[[/note]] Occasionally, the bit misconception still pops up today, such as some mistaking the UsefulNotes/{{Xbox 360}} as having a 360-Bit CPU (it actually uses 64-Bit hardware).

to:

This marketing scheme began very early on, in fact. The earliest recorded example comes from NEC and the UsefulNotes/TurboGrafx16, which, as its name suggests, was advertised as a 16-bit machine despite having an 8-bit CPU (the 16-bit part refers to its graphics chipset, which was capable of more than 256 colors). Even the UsefulNotes/{{Nintendo 64}}, while indeed capable of 64-bit calculations, ran with a 32-bit capacity most of the time for a lot of good reasons. To this very day, some people go so far as to call the [[UsefulNotes/SegaDreamcast Dreamcast]], UsefulNotes/PlayStation2, [[UsefulNotes/NintendoGameCube GameCube]] and UsefulNotes/{{Xbox}} generation "128-bit" thanks to this misinformation--out of these, only the Dreamcast and [=PS2=] were capable of 128-bit ''anything'' natively, and even then, not for general-purpose processing--and on top of that, the [=PS2's=] main processor is a 64-bit MIPS R5900,[[note]]Both the CPU and Vector Units have 128-bit Single-Instruction, Multiple Data (SIMD) instructions. However, these don't work on a single 128-bit value, but on either 4 32-bit values, 8 16-bit values, or 16 8-bit values.[[/note]] although Sony deliberately mismarketed it as twice that, both because the "more bits equals better graphics" belief still lingered and to compete with the equally misadvertised UsefulNotes/SegaDreamcast[[note]]Which had a 32-bit CPU with a 128-bit vector unit, a fact Sega got marketing mileage out of by advertising the whole system as 128-bit[[/note]].
Nintendo also got on the bandwagon, as the tech demo for the [=GameCube=] was called ''[[http://en.wikipedia.org/wiki/Super_Mario_128 Super Mario 128]]'', implying that the console had a 128-bit CPU.[[note]]Its [=PowerPC=]-based CPU has a 32-bit word size with a double-precision 64-bit FPU, which many took to mean the CPU was 128-bit since 2x64=128; the demo is actually named for the [[ExactlyWhatItSaysOnTheTin 128 individual 3D Marios it spawns on screen]]. In terms of bus width, the [=GameCube=]'s CPU has a 64-bit data bus, while its GPU has a 64-bit data bus and a 512-bit texture cache.[[/note]] Occasionally, the bit misconception still pops up today, such as some mistaking the UsefulNotes/{{Xbox 360}} as having a 360-bit CPU (it actually uses 64-bit hardware).


This marketing scheme began very early on, in fact. The earliest recorded example comes from NEC and the UsefulNotes/TurboGrafx16, which, as its name suggests, was advertised as a 16-bit machine despite having an 8-bit CPU (the 16-bit part refers to its graphics chipset, which was capable of more than 256 colors). Even the UsefulNotes/{{Nintendo 64}}, while indeed capable of 64-bit calculations, ran with a 32-bit capacity most of the time for a lot of good reasons. To this very day, some people go so far as to call the [[UsefulNotes/SegaDreamcast Dreamcast]], UsefulNotes/PlayStation2, [[UsefulNotes/NintendoGameCube GameCube]] and UsefulNotes/{{Xbox}} generation "128-bit" thanks to this misinformation--out of these, only the Dreamcast and [=PS2=] were capable of 128-bit ''anything'' natively, and even then, not for general-purpose processing--and on top of that, the [=PS2's=] main processor is a 64-bit MIPS R5900,[[note]]Both the CPU and Vector Units have 128-bit Single-Instruction, Multiple Data (SIMD) instructions. However, these don't work on a single 128-bit value, but on either 4 32-bit values, 8 16-bit values, or 16 8-bit values.[[/note]] although Sony deliberately mismarketed it as twice that, both because the "more bits equals better graphics" belief still lingered and to compete with the equally misadvertised UsefulNotes/SegaDreamcast[[note]]Which had a 32-bit CPU with a 128-bit vector unit, a fact Sega got marketing mileage out of by advertising the whole system as 128-bit[[/note]].
Nintendo also got on the bandwagon, as the tech demo for the [=GameCube=] was called ''[[http://en.wikipedia.org/wiki/Super_Mario_128 Super Mario 128]]'', implying that the console had a 128-bit CPU.[[note]]Its [=PowerPC=]-based CPU has a 32-bit word size with a double-precision 64-bit FPU, which many took to mean the CPU was 128-bit since 2x64=128; the demo is actually named for the [[ExactlyWhatItSaysOnTheTin 128 individual 3D Marios it spawns on screen]]. In terms of bus width, the GameCube's CPU has a 64-bit data bus, while its GPU has a 64-bit data bus and a 512-bit texture cache.[[/note]] Occasionally, the bit misconception still pops up today, such as some mistaking the UsefulNotes/{{Xbox 360}} as having a 360-Bit CPU (it actually uses 64-Bit hardware).

to:

This marketing scheme began very early on, in fact. The earliest recorded example comes from NEC and the UsefulNotes/TurboGrafx16, which, as its name suggests, was advertised as a 16-bit machine despite having an 8-bit CPU (the 16-bit part refers to its graphics chipset, which was capable of more than 256 colors). Even the UsefulNotes/{{Nintendo 64}}, while indeed capable of 64-bit calculations, ran with a 32-bit capacity most of the time for a lot of good reasons. To this very day, some people go so far as to call the [[UsefulNotes/SegaDreamcast Dreamcast]], UsefulNotes/PlayStation2, [[UsefulNotes/NintendoGameCube GameCube]] and UsefulNotes/{{Xbox}} generation "128-bit" thanks to this misinformation--out of these, only the Dreamcast and [=PS2=] were capable of 128-bit ''anything'' natively, and even then, not for general-purpose processing--and on top of that, the [=PS2's=] main processor is a 64-bit MIPS R5900,[[note]]Both the CPU and Vector Units have 128-bit Single-Instruction, Multiple Data (SIMD) instructions. However, these don't work on a single 128-bit value, but on either 4 32-bit values, 8 16-bit values, or 16 8-bit values.[[/note]] although Sony deliberately mismarketed it as twice that, both because the "more bits equals better graphics" belief still lingered and to compete with the equally misadvertised UsefulNotes/SegaDreamcast[[note]]Which had a 32-bit CPU with a 128-bit vector unit, a fact Sega got marketing mileage out of by advertising the whole system as 128-bit[[/note]].
Nintendo also got on the bandwagon, as the tech demo for the [=GameCube=] was called ''[[http://en.wikipedia.org/wiki/Super_Mario_128 Super Mario 128]]'', implying that the console had a 128-bit CPU.[[note]]Its [=PowerPC=]-based CPU has a 32-bit word size with a double-precision 64-bit FPU, which many took to mean the CPU was 128-bit since 2x64=128; the demo is actually named for the [[ExactlyWhatItSaysOnTheTin 128 individual 3D Marios it spawns on screen]]. In terms of bus width, the [=GameCube=]'s CPU has a 64-bit data bus, while its GPU has a 64-bit data bus and a 512-bit texture cache.[[/note]] Occasionally, the bit misconception still pops up today, such as some mistaking the UsefulNotes/{{Xbox 360}} as having a 360-Bit CPU (it actually uses 64-Bit hardware).


So far, we've only discussed the UsefulNotes/CentralProcessingUnit (CPU). Another equally important factor (or nowadays, ''more'' important) is the UsefulNotes/GraphicsProcessingUnit (GPU). Several marketing campaigns actually referred to the GPU, yet many misunderstood them and assumed they referred to the CPU. The UsefulNotes/TurboGrafx16, for example, was advertised as a "16-bit" system because its dual GPU chipset had a 16-bit data bus, but many erroneously took the "16-bit" to mean its 8-bit CPU was somehow 16-bit. Likewise, the UsefulNotes/NeoGeo was advertised as a "24-bit" system because its GPU chipset had a 24-bit data bus, but many erroneously assumed it referred to its two CPUs (one 16/32-bit, the other 8/16-bit). This misadvertising is by no means limited to consoles, as video card manufacturers for PCs and Macs also played up the hype, often claiming that more bits equals more vibrant colors or more realistic graphics compared to their competitors.

to:

So far, we've only discussed the UsefulNotes/CentralProcessingUnit (CPU). Another equally important factor (or nowadays, ''more'' important) is the UsefulNotes/GraphicsProcessingUnit (GPU). Several marketing campaigns actually referred to the GPU, yet many misunderstood them and assumed they referred to the CPU. The UsefulNotes/TurboGrafx16, for example, was advertised as a "16-bit" system because its dual GPU chipset had a 16-bit data bus, but many erroneously took the "16-bit" to mean its 8-bit CPU was somehow 16-bit. Likewise, the UsefulNotes/NeoGeo was advertised as a "24-bit" system because its GPU chipset had a 24-bit data bus, but many erroneously assumed it referred to its two CPUs (one 16/32-bit, the other 8/16-bit). This misadvertising is by no means limited to consoles, as video card manufacturers for [=PCs=] and Macs also played up the hype, often claiming that more bits equals more vibrant colors or more realistic graphics compared to their competitors.
Fixing Broken Link


** {{MP3}}

to:

** UsefulNotes/{{MP3}}


Which is not to say that the leap in bits was entirely pointless: an 8-bit word can only hold a value of up to 255, while a 16-bit word can hold up to 65,535 (which acts as a {{cap}} on anything the programmer wants to do: want a map with more than 256x256 squares? You're going to need 16 bits for that, and on the NES, for example, that's going to require more space than the CPU has easy access to). There is a point of diminishing returns, though: a 32-bit word holds values up to 4,294,967,295 (about 4 billion), which is more than enough for most purposes (and most 32-bit CPU instruction sets support 64-bit variables), so the jump to 64-bit words was usually driven more by bus limitations and memory growing larger than 4 gigabytes than by programmer needs.

to:

Which is not to say that the leap in bits was entirely pointless: an 8-bit word can only hold a value of up to 255, while a 16-bit word can hold up to 65,535 (which acts as a {{cap}} on anything the programmer wants to do: want a map with more than 256x256 squares? You're going to need 16 bits for that, and on the NES, for example, that's going to require more space than the CPU has easy access to). There is a point of diminishing returns, though: a 32-bit word holds values up to 4,294,967,295 (about 4 billion), which is more than enough for most purposes (and most 32-bit CPU instruction sets support 64-bit variables), so the jump to 64-bit words was usually driven more by bus limitations and memory growing larger than 4 gigabytes than by programmer needs. However, the advent of 3D games and the switch from tiles and sprites to polygons and textures did provide a valid need for 64-bit variables.
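Those caps fall straight out of the word size: an n-bit unsigned word tops out at 2^n - 1. A quick sketch:

```python
# The cap on an n-bit unsigned word is 2**n - 1; one step past it wraps to 0.

def unsigned_max(bits):
    """Largest value an unsigned word of the given width can hold."""
    return 2 ** bits - 1

for bits in (8, 16, 32, 64):
    print(f"{bits}-bit cap: {unsigned_max(bits):,}")

# The 8-bit cap is why so many classic-era counters stop at 255;
# the 32-bit cap is the "about 4 billion" figure mentioned above.
```

Running it prints 255, 65,535, 4,294,967,295, and the astronomically larger 64-bit figure, which is exactly the diminishing-returns point: nothing a game counts routinely needs more than the 32-bit range.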


Or take processor speed. The SuperNintendo's 5A22 CPU had a UsefulNotes/ClockSpeed of 3.58 [=MHz=] (megahertz). The [[UsefulNotes/SegaGenesis Sega Genesis/Mega Drive]]'s 68000 CPU had a clock speed of 7.67 [=MHz=], just over twice that of the Super NES. So Super NES games should run only half as fast, right? Sega certainly ran an advertising campaign around "[[http://segaretro.org/Blast_processing Blast Processing]]", and some early Super NES games did have slowdown. But when Capcom ported ''VideoGame/StreetFighterII Turbo'' to the Super NES, it had a secret mode that was faster than the UsefulNotes/ArcadeGame and just as fast as the Genesis version, with little noticeable slowdown. The mode turned out to be more gimmick than playable, but it showed that clock speed was only part of how the processor worked, and that proper use of the system was what made the game run so fast. That did not necessarily mean the Super NES was just as fast as the Genesis, however, as evidenced by the faster scrolling speeds of Genesis games like ''Franchise/SonicTheHedgehog'' and by the 3D polygon graphics of Genesis games like ''VideoGame/HardDrivin'' and ''VideoGame/StarCruiser'' not needing any UsefulNotes/{{Cartridge}} enhancement chips (which the Super NES required for 3D games like ''VideoGame/StarFox'').

to:

Or take processor speed. The UsefulNotes/SuperNintendo's 5A22 CPU had a UsefulNotes/ClockSpeed of 3.58 [=MHz=] (megahertz). The [[UsefulNotes/SegaGenesis Sega Genesis/Mega Drive]]'s 68000 CPU had a clock speed of 7.67 [=MHz=], just over twice that of the Super NES. So Super NES games should run only half as fast, right? Sega certainly ran an advertising campaign around "[[http://segaretro.org/Blast_processing Blast Processing]]", and some early Super NES games did have slowdown. But when Capcom ported ''VideoGame/StreetFighterII Turbo'' to the Super NES, it had a secret mode that was faster than the UsefulNotes/ArcadeGame and just as fast as the Genesis version, with little noticeable slowdown. The mode turned out to be more gimmick than playable, but it showed that clock speed was only part of how the processor worked, and that proper use of the system was what made the game run so fast. That did not necessarily mean the Super NES was just as fast as the Genesis, however, as evidenced by the faster scrolling speeds of Genesis games like ''Franchise/SonicTheHedgehog'' and by the 3D polygon graphics of Genesis games like ''VideoGame/HardDrivin'' and ''VideoGame/StarCruiser'' not needing any UsefulNotes/{{Cartridge}} enhancement chips (which the Super NES required for 3D games like ''VideoGame/StarFox'').
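The clock-speed point can be put in numbers: real throughput is the clock rate divided by the average cycles each instruction takes, so a lower-clocked chip can finish more work per second. The cycle counts below are made-up illustrations, not measured 5A22 or 68000 instruction timings:

```python
# Throughput = clock (cycles/second) / average cycles per instruction.
# The clocks match the consoles discussed above; the cycles-per-instruction
# figures are HYPOTHETICAL, chosen only to illustrate the principle.

def instructions_per_second(clock_hz, avg_cycles_per_instruction):
    """How many instructions finish per second at a given clock and cost."""
    return clock_hz / avg_cycles_per_instruction

cheap_instructions = instructions_per_second(3.58e6, 3)  # slower clock
costly_instructions = instructions_per_second(7.67e6, 8)  # faster clock

# The 3.58 MHz chip with 3-cycle instructions outruns the 7.67 MHz chip
# with 8-cycle instructions, despite less than half the clock speed.
print(cheap_instructions > costly_instructions)  # True
```

This is the whole trick behind "clock speed is only part of the story": megahertz tells you how fast the metronome ticks, not how much gets done per tick.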


So far, we've only discussed the UsefulNotes/CentralProcessingUnit (CPU). Another equally important factor (or nowadays, ''more'' important) is the UsefulNotes/GraphicsProcessingUnit (GPU). Several marketing campaigns actually referred to the GPU, yet many misunderstood them and assumed they referred to the CPU. The UsefulNotes/TurboGrafx16, for example, was advertised as a "16-bit" system because its dual GPU chipset had a 16-bit data bus, but many erroneously took the "16-bit" to mean its 8-bit CPU was somehow 16-bit. Likewise, the UsefulNotes/NeoGeo was advertised as a "24-bit" system because its GPU chipset had a 24-bit data bus, but many erroneously assumed it referred to its two CPUs (one 16/32-bit, the other 8/16-bit).

to:

So far, we've only discussed the UsefulNotes/CentralProcessingUnit (CPU). Another equally important factor (or nowadays, ''more'' important) is the UsefulNotes/GraphicsProcessingUnit (GPU). Several marketing campaigns actually referred to the GPU, yet many misunderstood them and assumed they referred to the CPU. The UsefulNotes/TurboGrafx16, for example, was advertised as a "16-bit" system because its dual GPU chipset had a 16-bit data bus, but many erroneously took the "16-bit" to mean its 8-bit CPU was somehow 16-bit. Likewise, the UsefulNotes/NeoGeo was advertised as a "24-bit" system because its GPU chipset had a 24-bit data bus, but many erroneously assumed it referred to its two CPUs (one 16/32-bit, the other 8/16-bit). This misadvertising is by no means limited to consoles, as video card manufacturers for PCs and Macs also played up the hype, often claiming that more bits equals more vibrant colors or more realistic graphics compared to their competitors.
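What a data-bus "bit count" actually buys is bytes moved per transfer, and from that, peak bandwidth. A sketch with a hypothetical transfer rate (not a real TurboGrafx-16 or Neo Geo figure):

```python
# A bus width in bits sets how many bytes move per transfer (width / 8).
# Peak bandwidth = bytes per transfer * transfers per second.
# The 5,000,000 transfers/s rate below is a made-up example number.

def peak_bandwidth_bytes(bus_bits, transfers_per_second):
    """Peak bytes per second over a bus of the given width."""
    return (bus_bits // 8) * transfers_per_second

# A 16-bit bus moves 2 bytes per transfer; a 24-bit bus moves 3.
print(peak_bandwidth_bytes(16, 5_000_000))  # 10000000 (10 MB/s peak)
print(peak_bandwidth_bytes(24, 5_000_000))  # 15000000 (15 MB/s peak)
```

Note that none of this says anything about the CPU's word size, which is exactly why advertising a GPU bus width as the system's "bits" misled so many buyers.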

Added DiffLines:

** UsefulNotes/JavaSoftwarePlatform


This marketing scheme began very early on, in fact. The earliest recorded example comes from NEC and the UsefulNotes/TurboGrafx16, which, as its name suggests, was advertised as a 16-bit machine despite having an 8-bit CPU (the 16-bit part refers to its graphics chipset, which was capable of more than 256 colors). Even the UsefulNotes/{{Nintendo 64}}, while indeed capable of 64-bit calculations, ran with a 32-bit capacity most of the time for a lot of good reasons. To this very day, some people go so far as to call the [[UsefulNotes/SegaDreamcast Dreamcast]], UsefulNotes/PlayStation2, [[UsefulNotes/NintendoGameCube GameCube]] and UsefulNotes/{{Xbox}} generation "128-bit" thanks to this misinformation--out of these, only the Dreamcast and [=PS2=] were capable of 128-bit ''anything'' natively, and even then, not for general-purpose processing--and on top of that, the [=PS2's=] main processor is a 64-bit MIPS R5900,[[note]]Both the CPU and Vector Units have 128-bit Single-Instruction, Multiple Data (SIMD) instructions. However, these don't work on a single 128-bit value, but on either 4 32-bit values, 8 16-bit values, or 16 8-bit values.[[/note]] although Sony deliberately mismarketed it as twice that, both because the "more bits equals better graphics" belief still lingered and to compete with the UsefulNotes/SegaDreamcast[[note]]Which had a 32-bit CPU with a 128-bit vector unit, a fact Sega got marketing mileage out of by advertising the whole system as 128-bit[[/note]].
Nintendo also got on the bandwagon, as the tech demo for the [=GameCube=] was called ''[[http://en.wikipedia.org/wiki/Super_Mario_128 Super Mario 128]]'', implying that the console had a 128-bit CPU.[[note]]Its [=PowerPC=]-based CPU has a 32-bit word size with a double-precision 64-bit FPU, which many took to mean the CPU was 128-bit since 2x64=128; the demo is actually named for the [[ExactlyWhatItSaysOnTheTin 128 individual 3D Marios it spawns on screen]]. In terms of bus width, the GameCube's CPU has a 64-bit data bus, while its GPU has a 64-bit data bus and a 512-bit texture cache.[[/note]] Occasionally, the bit misconception still pops up today, such as some mistaking the UsefulNotes/{{Xbox 360}} as having a 360-Bit CPU (it actually uses 64-Bit hardware).

to:

This marketing scheme began very early on, in fact. The earliest recorded example of this being tried out is with NEC and the UsefulNotes/TurboGrafx16, which is, as evident in it's name, advertised as a 16-bit machine despite having a 8-bit CPU (the 16-bit part refers to it’s Graphics chipset, which was capable of more than 256 colors). Even the UsefulNotes/{{Nintendo 64}}, for example, while indeed capable of 64-bit calculations, ran with a 32-bit capacity most of the time for a lot of good reasons. Some people have gone so far as to call the [[UsefulNotes/SegaDreamcast Dreamcast]], UsefulNotes/PlayStation2, [[UsefulNotes/NintendoGameCube GameCube]] and UsefulNotes/{{Xbox}} generation "128-bit" thanks to this misinformation to this very day--out of these, only the Dreamcast and [=PS2=] were capable of 128-bit ''anything'' natively, and even then, it's not for general-purpose processing--and on top of that, the [=PS2's=] main processor is a 64-bit MIPS R5900,[[note]]Both the CPU and Vector Units have 128-bit Single-Instruction, Multiple Data (SIMD) instructions. However, these don't work on a single 128-bit value, but either 4 32-bit values, 8 16-bit values, or 16 8-bit values.[[/note]] although Sony deliberately mismarketed it as being twice as much that due to the "more bits equals better graphics" belief still lingering, and to compete with the equally misadvertised UsefulNotes/SegaDreamcast[[note]]Which had a 32 bit CPU, with a 128 bit vector unit, which Sega got marketing mileage out of by using this fact to advertise the whole system as being 128 bit[[/note]]. 
Nintendo also got on the bandwagon, as the tech demo for the [=GameCube=] was called ''[[http://en.wikipedia.org/wiki/Super_Mario_128 Super Mario 128]]'', implying that the console has a 128-bit CPU.[[note]]Its [=PowerPC=]-based CPU has a 32-bit word size with a double-precision 64-bit FPU, which many took to mean that the CPU was 128-bit since 2x64=128, although the real reason the demo is named as such is because it spawns [[ExactlyWhatItSaysOnTheTin 128 individual 3D Marios on screen]]). In terms of bus width, the GameCube's CPU has a 64-bit data bus, while its GPU has a 64-bit data bus and 512-bit texture cache.[[/note]] Occasionally, the bit misconception still pops up today, such as some mistaking the UsefulNotes/{{Xbox 360}} as having a 360-Bit CPU (it actually uses 64-Bit hardware).
Is there an issue? Send a MessageReason:
None


This marketing scheme began very early on, in fact. Even the UsefulNotes/{{Nintendo 64}}, for example, while indeed capable of 64-bit calculations, ran with a 32-bit capacity most of the time for a lot of good reasons. Some people have gone so far as to call the [[UsefulNotes/SegaDreamcast Dreamcast]], UsefulNotes/PlayStation2, [[UsefulNotes/NintendoGameCube GameCube]] and UsefulNotes/{{Xbox}} generation "128-bit" thanks to this misinformation to this very day--out of these, only the Dreamcast and [=PS2=] were capable of 128-bit ''anything'' natively, and even then, it's not for general-purpose processing--and on top of that, the [=PS2's=] main processor is a 64-bit MIPS R5900,[[note]]Both the CPU and Vector Units have 128-bit Single-Instruction, Multiple Data (SIMD) instructions. However, these don't work on a single 128-bit value, but either 4 32-bit values, 8 16-bit values, or 16 8-bit values.[[/note]] although Sony deliberately mismarketed it as being twice as much that due to the "more bits equals better graphics" belief still lingering, and to compete with the UsefulNotes/SegaDreamcast[[note]]Which had a 32 bit CPU, with a 128 bit vector unit, which Sega got marketing mileage out of by using this fact to advertise the whole system as being 128 bit[[/note]]. Nintendo also got on the bandwagon, as the tech demo for the [=GameCube=] was called ''[[http://en.wikipedia.org/wiki/Super_Mario_128 Super Mario 128]]'', implying that the console has a 128-bit CPU.[[note]]Its [=PowerPC=]-based CPU has a 32-bit word size with a double-precision 64-bit FPU, which many took to mean that the CPU was 128-bit since 2x64=128, although the real reason the demo is named as such is because it spawns [[ExactlyWhatItSaysOnTheTin 128 individual 3D Marios on screen]]). 
In terms of bus width, the GameCube's CPU has a 64-bit data bus, while its GPU has a 64-bit data bus and 512-bit texture cache.[[/note]] Occasionally, the bit misconception still pops up today, such as some mistaking the UsefulNotes/{{Xbox 360}} as having a 360-Bit CPU (it actually uses 64-Bit hardware).

to:

This marketing scheme began very early on, in fact. The earliest recorded example of this being tried out is with NEC and the UsefulNotes/TurboGrafx16, which is, as evident in it's name, advertised as a 16-bit machine despite having a 8-bit CPU (the 16-bit part refers to it’s Graphics chipset, which was capable of more than 256 colors). Even the UsefulNotes/{{Nintendo 64}}, for example, while indeed capable of 64-bit calculations, ran with a 32-bit capacity most of the time for a lot of good reasons. Some people have gone so far as to call the [[UsefulNotes/SegaDreamcast Dreamcast]], UsefulNotes/PlayStation2, [[UsefulNotes/NintendoGameCube GameCube]] and UsefulNotes/{{Xbox}} generation "128-bit" thanks to this misinformation to this very day--out of these, only the Dreamcast and [=PS2=] were capable of 128-bit ''anything'' natively, and even then, it's not for general-purpose processing--and on top of that, the [=PS2's=] main processor is a 64-bit MIPS R5900,[[note]]Both the CPU and Vector Units have 128-bit Single-Instruction, Multiple Data (SIMD) instructions. However, these don't work on a single 128-bit value, but either 4 32-bit values, 8 16-bit values, or 16 8-bit values.[[/note]] although Sony deliberately mismarketed it as being twice as much that due to the "more bits equals better graphics" belief still lingering, and to compete with the UsefulNotes/SegaDreamcast[[note]]Which had a 32 bit CPU, with a 128 bit vector unit, which Sega got marketing mileage out of by using this fact to advertise the whole system as being 128 bit[[/note]]. 
Nintendo also got on the bandwagon, as the tech demo for the [=GameCube=] was called ''[[http://en.wikipedia.org/wiki/Super_Mario_128 Super Mario 128]]'', implying that the console has a 128-bit CPU.[[note]]Its [=PowerPC=]-based CPU has a 32-bit word size with a double-precision 64-bit FPU, which many took to mean that the CPU was 128-bit since 2x64=128, although the real reason the demo is named as such is because it spawns [[ExactlyWhatItSaysOnTheTin 128 individual 3D Marios on screen]]). In terms of bus width, the GameCube's CPU has a 64-bit data bus, while its GPU has a 64-bit data bus and 512-bit texture cache.[[/note]] Occasionally, the bit misconception still pops up today, such as some mistaking the UsefulNotes/{{Xbox 360}} as having a 360-Bit CPU (it actually uses 64-Bit hardware).
Is there an issue? Send a MessageReason:


Or take processor speed. The SuperNintendo's 5A22 CPU had a UsefulNotes/ClockSpeed of 3.58 [=MHz=] (megahertz). The [[UsefulNotes/SegaGenesis Sega Genesis/Mega Drive]]'s 68000 CPU had a clock speed of 7.67 [=MHz=], just over twice the Super NES. So that meant that Super NES games should only run half as fast, right? Sega did have this advertising campaign of "[[http://segaretro.org/Blast_processing Blast Processing]]", and some early Super NES games did have slowdown. When Capcom ported ''VideoGame/StreetFighterII Turbo'' to the Super NES, it had a secret mode that was faster than the UsefulNotes/ArcadeGame and just as fast as the Genesis version, with little noticeable slowdown. The mode turned out to be more gimmick than playable, but it showed that the clock speed was only part of how the processor worked, and proper use of the system was what made the game run so fast. It did not necessarily mean the Super NES is just as fast as the Genesis, however, as evidenced by the faster scrolling speeds of Genesis games like ''Franchise/SonicTheHedgehog'' and the 3D polygon graphics of Genesis games like ''VideoGame/HardDrivin'' and ''VideoGame/StarCruiser'' not needing any {{Cartridge}} enhancement chips (which the Super NES required for 3D games like ''StarFox'').

to:

Or take processor speed. The SuperNintendo's 5A22 CPU had a UsefulNotes/ClockSpeed of 3.58 [=MHz=] (megahertz). The [[UsefulNotes/SegaGenesis Sega Genesis/Mega Drive]]'s 68000 CPU had a clock speed of 7.67 [=MHz=], just over twice the Super NES. So that meant that Super NES games should only run half as fast, right? Sega did have this advertising campaign of "[[http://segaretro.org/Blast_processing Blast Processing]]", and some early Super NES games did have slowdown. When Capcom ported ''VideoGame/StreetFighterII Turbo'' to the Super NES, it had a secret mode that was faster than the UsefulNotes/ArcadeGame and just as fast as the Genesis version, with little noticeable slowdown. The mode turned out to be more gimmick than playable, but it showed that the clock speed was only part of how the processor worked, and proper use of the system was what made the game run so fast. It did not necessarily mean the Super NES is just as fast as the Genesis, however, as evidenced by the faster scrolling speeds of Genesis games like ''Franchise/SonicTheHedgehog'' and the 3D polygon graphics of Genesis games like ''VideoGame/HardDrivin'' and ''VideoGame/StarCruiser'' not needing any {{Cartridge}} UsefulNotes/{{Cartridge}} enhancement chips (which the Super NES required for 3D games like ''StarFox'').
''VideoGame/StarFox'').
Is there an issue? Send a MessageReason:


So far, we've only discussed the CentralProcessingUnit (CPU). Another equally important factor (or nowadays, ''more'' important) is the GraphicsProcessingUnit (GPU). Several marketing campaigns often referred to the GPU, yet many misunderstood them, erroneously assuming they referred to the CPU. The UsefulNotes/TurboGrafx16, for example, was advertised as a "16-bit" system because its dual GPU chipset had a 16-bit data bus, but many erroneously assumed the "16-bit" was referring to its 8-bit CPU somehow being 16-bit. Likewise, the UsefulNotes/NeoGeo was advertised as a "24-bit" system because its GPU chipset had a 24-bit data bus, but many erroneously assumed it was referring to its CPU processors (one 16/32-bit, the other 8/16-bit).

to:

So far, we've only discussed the CentralProcessingUnit UsefulNotes/CentralProcessingUnit (CPU). Another equally important factor (or nowadays, ''more'' important) is the GraphicsProcessingUnit UsefulNotes/GraphicsProcessingUnit (GPU). Several marketing campaigns often referred to the GPU, yet many misunderstood them, erroneously assuming they referred to the CPU. The UsefulNotes/TurboGrafx16, for example, was advertised as a "16-bit" system because its dual GPU chipset had a 16-bit data bus, but many erroneously assumed the "16-bit" was referring to its 8-bit CPU somehow being 16-bit. Likewise, the UsefulNotes/NeoGeo was advertised as a "24-bit" system because its GPU chipset had a 24-bit data bus, but many erroneously assumed it was referring to its CPU processors (one 16/32-bit, the other 8/16-bit).
Is there an issue? Send a MessageReason:
None


[[DescribeTopicHere Describe]] How Video Game Specs Work [[DescribeTopicHere here]], but in ''layman's terms'', please (like the ''Guinness World Records'', we're just here to head off [[InternetBackdraft hunting bets]], not prep you for IT 101.) This isn't TheOtherWiki.

to:

[[DescribeTopicHere Describe]] How Video Game Specs Work [[DescribeTopicHere here]], but in ''layman's terms'', please (like the ''Guinness World Records'', we're just here to head off [[InternetBackdraft hunting bets]], not prep you for IT 101.) This isn't TheOtherWiki.
Wiki/TheOtherWiki.
Is there an issue? Send a MessageReason:

Added DiffLines:

[[DescribeTopicHere Describe]] How Video Game Specs Work [[DescribeTopicHere here]], but in ''layman's terms'', please (like the ''Guinness World Records'', we're just here to head off hunting bets, not prep you for IT 101.) This isn't Wiki/TheOtherWiki.

Ah, video game specs. The fanboys like to use them as the ultimate indisputable weapon in their {{Flame War}}s. But few of them really know what those specs mean, since they rarely look past a single number.

Take the old bit size (8-bit, 16-bit, 32-bit, 64-bit, etc.). Most of us know that bit size isn't a real measurement of a system's power, but few of us even know why. It's actually ''word'' size; the measurement of how large a chunk of information can be handled by the processor at once.[[note]]This can mean the size of the registers, the innermost memory inside the CPU. Or it can refer to the width of the CPU's "data bus", the wires that transfer data between the CPU and main memory. A 64-bit data bus, for example, means that there are 64 data wires running from the CPU to the main memory board, each of which can carry one bit at a time, so the CPU can "read" or "write" that much data in a single system-clock tick.[[/note]] In theory, larger word size should mean faster processing, but it's not that simple--especially since computer manufacturers have figured out that people are using "word size" as a quick-and-dirty proxy for "fastness" (or worse, "realistic graphics"[[note]]While it's true that the number of bits dictate how many colors can be displayed on screen at a given time and the maximum resolution of the screen, the only hardware affected by this is the ''GPU'' and even then, the amount of memory the GPU has or is capable of addressing also plays a role in determining these capabilities[[/note]]), and started playing arcane tricks designed to boost word size at the possible expense of actual improvements in the rest of the architecture.

This marketing scheme began very early on, in fact. The earliest recorded example is NEC's UsefulNotes/TurboGrafx16, which, as evident in its name, was advertised as a 16-bit machine despite having an 8-bit CPU (the "16-bit" part refers to its graphics chipset, which was capable of displaying more than 256 colors). Even the UsefulNotes/{{Nintendo 64}}, while indeed capable of 64-bit calculations, ran with a 32-bit capacity most of the time for a lot of good reasons. Some people have gone so far as to call the [[UsefulNotes/SegaDreamcast Dreamcast]], UsefulNotes/PlayStation2, [[UsefulNotes/NintendoGameCube GameCube]] and UsefulNotes/{{Xbox}} generation "128-bit" thanks to this misinformation to this very day--out of these, only the Dreamcast and [=PS2=] were capable of 128-bit ''anything'' natively, and even then, not for general-purpose processing--and on top of that, the [=PS2's=] main processor is a 64-bit MIPS R5900,[[note]]Both the CPU and Vector Units have 128-bit Single-Instruction, Multiple Data (SIMD) instructions. However, these don't work on a single 128-bit value, but on either 4 32-bit values, 8 16-bit values, or 16 8-bit values.[[/note]] although Sony deliberately mismarketed it as being twice that due to the lingering "more bits equals better graphics" belief, and to compete with the equally misadvertised UsefulNotes/SegaDreamcast[[note]]Which had a 32-bit CPU with a 128-bit vector unit, a fact Sega got marketing mileage out of by advertising the whole system as being 128-bit[[/note]].

Nintendo also got on the bandwagon: the tech demo for the [=GameCube=] was called ''[[http://en.wikipedia.org/wiki/Super_Mario_128 Super Mario 128]]'', implying that the console had a 128-bit CPU.[[note]]Its [=PowerPC=]-based CPU has a 32-bit word size with a double-precision 64-bit FPU, which many took to mean that the CPU was 128-bit since 2x64=128; the real reason the demo is named as such is that it spawns [[ExactlyWhatItSaysOnTheTin 128 individual 3D Marios on screen]]. In terms of bus width, the GameCube's CPU has a 64-bit data bus, while its GPU has a 64-bit data bus and 512-bit texture cache.[[/note]] Occasionally, the bit misconception still pops up today, such as some mistaking the UsefulNotes/{{Xbox 360}} as having a 360-bit CPU (it actually uses 64-bit hardware).

Which is not to say that the leap in bits was entirely pointless: an 8-bit word can only hold a value of up to 255, while a 16-bit word can hold a value of up to 65,535 (which acts as a {{cap}} on anything the programmer wants to do: want a map with more than 256x256 squares? You're going to need to go to 16 bits for that, and on the NES, for example, that's gonna require more space than the CPU has easy access to). There is a point of diminishing returns, though: a 32-bit word holds a value of up to 4,294,967,295 (about 4.3 billion), which is more than enough for most purposes (and most 32-bit CPU instruction sets support 64-bit variables), so the jump to 64-bit words was usually driven more by bus limitations and memory growing larger than 4 gigabytes than by programmer needs.

However, what is done with the processor can make a significant difference in performance. An instructor in a class on mainframe programming gave an example. To clear a line of eighty 8-bit characters to blanks, such as for printing, the common practice was to put a space in the first character, then move the entire field from character 1 to character 2 for a length of one less than the size of the field. This moves character 1 to character 2, then to 3, and so on for 80 characters. What he pointed out was that even though floating-point arithmetic is much slower than character moves, it would have been faster to put, say, 4 blanks in a floating-point register and store it 20 times than to move one character 80 times.[[note]]It might seem that the two approaches should take the same time, but the ripple move is 80 separate one-byte actions while the register approach is only 20 stores, and the machine moves data a full word at a time even when only 8 bits of it are actually needed.[[/note]]

Or take processor speed. The SuperNintendo's 5A22 CPU had a UsefulNotes/ClockSpeed of 3.58 [=MHz=] (megahertz). The [[UsefulNotes/SegaGenesis Sega Genesis/Mega Drive]]'s 68000 CPU had a clock speed of 7.67 [=MHz=], just over twice that of the Super NES. So that meant Super NES games should only run half as fast, right? Sega did have its "[[http://segaretro.org/Blast_processing Blast Processing]]" advertising campaign, and some early Super NES games did have slowdown. When Capcom ported ''VideoGame/StreetFighterII Turbo'' to the Super NES, it had a secret mode that was faster than the UsefulNotes/ArcadeGame and just as fast as the Genesis version, with little noticeable slowdown. The mode turned out to be more gimmick than playable, but it showed that clock speed was only part of how the processor worked, and proper use of the system was what made the game run so fast. It did not necessarily mean the Super NES was just as fast as the Genesis, however, as evidenced by the faster scrolling speeds of Genesis games like ''Franchise/SonicTheHedgehog'' and the 3D polygon graphics of Genesis games like ''VideoGame/HardDrivin'' and ''VideoGame/StarCruiser'' not needing any UsefulNotes/{{Cartridge}} enhancement chips (which the Super NES required for 3D games like ''VideoGame/StarFox'').

Nowadays, most hardware engineers have given up on bumping clock speeds and gone on to improve bus speeds, which has less to do with how fast the computer "thinks" than with how fast it "talks". Taking the same example of the fourth generation ("16-bit era") consoles, the Super Nintendo's 5A22 has a 2.68 [=MHz=] data bus, while the Mega Drive's 68000 has a 5 [=MHz=] data bus. However, the 5A22 writes on every bus cycle (at 2.68 [=MHz=]), whereas the 68000 requires 4 clock cycles for every write (at 1.92 [=MHz=]). The advantage of the 68000 is that its data bus has a wider 16-bit bus width, twice that of the 5A22's 8-bit data bus. The 68000 thus writes 16 bits (2 bytes) at a time, giving it a 3.8 MB/s write speed, whereas the 5A22 writes 8 bits (1 byte) at a time, giving it a 2.6 MB/s write speed.

So far, we've only discussed the UsefulNotes/CentralProcessingUnit (CPU). Another equally important factor (or nowadays, ''more'' important) is the UsefulNotes/GraphicsProcessingUnit (GPU). Several marketing campaigns referred to the GPU, yet many misunderstood them, erroneously assuming they referred to the CPU. The UsefulNotes/TurboGrafx16, for example, was advertised as a "16-bit" system because its dual GPU chipset had a 16-bit data bus, but many erroneously assumed the "16-bit" referred to its 8-bit CPU somehow being 16-bit. Likewise, the UsefulNotes/NeoGeo was advertised as a "24-bit" system because its GPU chipset had a 24-bit data bus, but many erroneously assumed it referred to its CPU processors (one 16/32-bit, the other 8/16-bit).

The most infamous example of such a misunderstanding is the "[[http://segaretro.org/Blast_processing Blast Processing]]" campaign of the UsefulNotes/SegaGenesis. Many erroneously assumed the term was referring to its CPU's faster clock speed, but it was in fact referring to its GPU's faster [[https://en.wikipedia.org/wiki/Direct_memory_access DMA]] (direct memory access). The Genesis GPU had a 13 [=MHz=] clock speed and 8 [=MHz=] bus speed, whereas the Super NES GPU had a 5.3 [=MHz=] clock speed and 3.5 [=MHz=] bus speed. This allowed the Genesis GPU to have a faster DMA write speed of 3.2–6.4 MB/s, compared to the Super Nintendo's 2.6 MB/s DMA write speed. This was what ''really'' gave the Genesis its faster performance, not just its CPU's higher clock speed.

So if you really want to know what specs actually mean for video game systems (which covers computers, set-tops, and handhelds), just take a look at the following pages.

See VideoGameSystems to see how these specs work for them.
----
!!Pages:
[[index]]
!!!Nitty and gritty:
* UsefulNotes/AnalogVsDigital
* UsefulNotes/BinaryBitsAndBytes
** UsefulNotes/PowersOfTwoMinusOne
** UsefulNotes/BinaryLogic
** UsefulNotes/BinaryPrefix
* UsefulNotes/ClockSpeed

!!!Hardware
* UsefulNotes/CentralProcessingUnit
** UsefulNotes/FlynnsTaxonomy
** UsefulNotes/MultiCoreProcessor
* UsefulNotes/DisplayTechnology
* UsefulNotes/GraphicsProcessingUnit
** UsefulNotes/GraphicsRendering
** UsefulNotes/VideoRAM
* UsefulNotes/RandomAccessMemory
** UsefulNotes/MemoryHierarchy
* UsefulNotes/ReadOnlyMemory
* UsefulNotes/MassStorage
** UsefulNotes/MagneticDisk
** UsefulNotes/OpticalDisc
*** UsefulNotes/CompactDisc
*** UsefulNotes/{{DVD}}
*** UsefulNotes/BluRay
** UsefulNotes/{{Cartridge}}
*** UsefulNotes/FlashMemory (including Solid State Drives)

!!!Software
* UsefulNotes/GameEngine
** UsefulNotes/{{Middleware}}
** UsefulNotes/PhysicsEngine
* UsefulNotes/OperatingSystem
** UsefulNotes/DOSBox
*** UsefulNotes/{{DOS4GW}}
** UsefulNotes/MacOS
** UsefulNotes/MicrosoftWindows
*** UsefulNotes/{{WINE}}
** UsefulNotes/{{UNIX}}
** UsefulNotes/ApplicationProgrammingInterface
*** UsefulNotes/CursesAPI
* ProceduralGeneration
* UsefulNotes/ProgrammingLanguage
** UsefulNotes/{{Python}}
** UsefulNotes/ScriptingLanguage
*** UsefulNotes/{{Lua}}
** UsefulNotes/TheCLanguage
* UsefulNotes/RandomNumberGenerator
* UsefulNotes/SoftwarePorting

!!!Related topics
* UsefulNotes/BitmapsSpritesAndTextures
** PixelVsTexel
** UsefulNotes/TextureCompression
* UsefulNotes/DigitalDistribution
* UsefulNotes/GamingAudio
** UsefulNotes/{{MIDI}}
** UsefulNotes/{{MOD}}
** {{MP3}}
** UsefulNotes/WavAudio
* UsefulNotes/PolygonalGraphics
* RetroGaming
* VideoGameCulture
* VideoGameInterfaceElements
** UsefulNotes/GeneralGamingGamepads

[[/index]]
----
