How Video Game Specs Work
We explain How Video Game Specs Work here, but in layman's terms, please (like the Guinness World Records, we're just here to head off hunting bets, not prep you for IT 101). This isn't The Other Wiki.
Ah, video game specs. The fanboys like to use them as the ultimate indisputable weapon in their Flame Wars. But they don't really know what the specs mean, since they barely look past a single aspect of them.
Take the old bit size (8-bit, 16-bit, 32-bit, 64-bit, etc.). Most of us know that bit size isn't a real measurement of a system's power, but few of us know why. It's actually word size: the measurement of how large a chunk of information the processor can handle at once. In theory, a larger word size should mean faster processing, but it's not that simple, especially since computer manufacturers figured out that people were using "word size" as a quick-and-dirty proxy for "fastness" (or worse, "realistic graphics") and started playing arcane tricks designed to boost word size at the possible expense of actual improvements in the rest of the architecture. This marketing scheme began very early on: the Neo Geo was advertised as a "24-bit" system, and the Nintendo 64, while indeed capable of 64-bit calculations, ran with a 32-bit capacity most of the time for a lot of good reasons. To this very day, some people go so far as to call the Dreamcast, PlayStation 2, and GameCube generation "128-bit" thanks to this misinformation. Of those, only the PS2 was capable of anything 128-bit natively, and even then not for general-purpose processing; its main processor is a 64-bit MIPS R5900, although Sony deliberately mismarketed it as twice that, both because the "more bits equals better graphics" belief still lingered and to compete with the equally misnumbered Sega Dreamcast. Even Nintendo got on the bandwagon: the tech demo for the GameCube was called Super Mario 128, implying that the console had a 128-bit CPU. Occasionally the bit misconception still pops up today, such as some people assuming the Xbox 360 has a 360-bit CPU (it actually uses 64-bit hardware).
Which is not to say that the leap in bits was entirely pointless: an 8-bit word can only hold a value of up to 255, while a 16-bit word can hold a value of up to 65,535. That acts as a cap on anything the programmer wants to do: want a map bigger than 256x256 squares? You're going to need 16-bit values for that, and on the NES, for example, that requires more space than the CPU has easy access to. There is a point of diminishing returns, though: a 32-bit word holds a value of up to 4,294,967,295 (about 4 billion), which is more than enough for most purposes (and most 32-bit CPU instruction sets support 64-bit variables anyway), so the jump to 64-bit words was usually driven more by bus limitations and memory growing larger than 4 gigabytes than by programmer needs.
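Those caps fall straight out of the arithmetic: an N-bit word stores values modulo 2^N, and anything bigger silently wraps around. A minimal sketch (the function names here are ours, purely for illustration):

```python
# Illustrative sketch: the largest value an N-bit word can hold,
# and what happens when a value exceeds it (hardware wraps modulo 2**N).
def max_unsigned(bits):
    """Largest unsigned value an N-bit word can store."""
    return 2 ** bits - 1

def wrap(value, bits):
    """Simulate storing a value into an N-bit register (it wraps around)."""
    return value % (2 ** bits)

print(max_unsigned(8))    # 255
print(max_unsigned(16))   # 65535
print(max_unsigned(32))   # 4294967295
print(wrap(255 + 1, 8))   # 0 -- the classic 8-bit overflow
```

That last line is why so many old games have counters that roll over from 255 back to 0.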
However, what is done with the processor can make a significant difference in performance. An instructor in a class on mainframe programming gave an example: to clear a line of eighty 8-bit characters to blanks, such as for printing, the common practice was to put a space in the first character, then move the entire field from character 1 to character 2 with a length one less than the size of the field. This moves character 1 into character 2, then into character 3, and so on across all 80 characters. His point was that even though floating-point arithmetic is much slower than character-move operations, it would have been faster to put, say, four blanks in a floating-point register and store that 20 times than to move one character 80 times.
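The instructor's trick can be sketched in miniature (a hypothetical simulation with names of our own choosing, not actual mainframe code): both routines below blank the same 80-character line, but one does 79 single-byte moves while the other does 20 four-byte stores.

```python
# Sketch of the mainframe anecdote: two ways to blank an 80-character line.
def clear_byte_at_a_time(line):
    """Put a blank in position 0, then propagate it one byte per move
    (the overlapping-move idiom the instructor described)."""
    line[0] = ord(' ')
    for i in range(1, len(line)):      # 79 single-byte moves
        line[i] = line[i - 1]
    return line

def clear_four_at_a_time(line):
    """Store a 4-byte chunk of blanks 20 times: fewer, wider stores."""
    blanks = b'    '
    for i in range(0, len(line), 4):   # 20 four-byte stores
        line[i:i + 4] = blanks
    return line

a = clear_byte_at_a_time(bytearray(b'X' * 80))
b = clear_four_at_a_time(bytearray(b'X' * 80))
assert a == b == bytearray(b' ' * 80)  # same result, far fewer operations
```

The result is identical; only the number of operations differs, which is the whole point.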
Or take processor speed. The Super Nintendo's 65816 CPU had a Clock Speed of 3.58 megahertz. The Sega Genesis's 68000 CPU had a clock speed of 7.67 MHz, just over twice that of the Super NES. So that meant Super NES games should only run half as fast, right? Sega certainly had an advertising campaign built on "blast processing", and some early Super NES games did have slowdown. Yet when Capcom ported Street Fighter II Turbo to the Super NES, it had a secret mode that was even faster than the arcade speed, with no slowdown. The mode turned out to be more gimmick than playable, but it showed that clock speed is only part of how a processor performs, and proper use of the system was what made the game run so fast (the Genesis port also had no slowdown in that mode).
In fact, nowadays most hardware engineers have given up on bumping clock speeds and gone on to improve bus speeds, which has less to do with how fast the computer "thinks" than with how fast it "talks". Taking the same example of fourth-generation consoles, the 65816 could "talk" at full speed, while the 68000 could "talk" only once every four cycles, resulting in an actual speed of 1.92 MHz for the Genesis. But the 68000 could push 16 bits every bus cycle, twice that of a 65816, making them not too different in overall bus speed (3.58 MB/s vs. 3.84 MB/s).
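The arithmetic behind those bandwidth figures is simple enough to sketch (the function name is our own invention): effective throughput is the clock rate, divided by how many cycles each transfer takes, times how many bytes move per transfer.

```python
# Back-of-envelope sketch of the bus figures quoted above.
def bus_throughput_mb_s(clock_mhz, cycles_per_transfer, bytes_per_transfer):
    """Effective bus throughput in MB/s: transfers per second times width."""
    return clock_mhz / cycles_per_transfer * bytes_per_transfer

snes_65816 = bus_throughput_mb_s(3.58, 1, 1)     # 8-bit bus, a transfer every cycle
genesis_68000 = bus_throughput_mb_s(7.67, 4, 2)  # 16-bit bus, one transfer per 4 cycles
print(snes_65816)     # 3.58
print(genesis_68000)  # 3.835
```

So despite the Genesis's headline clock being more than double, the two machines land within about 7% of each other on raw bus throughput.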
So if you really want to know what specs actually mean for video game systems (which covers computers, set-tops, and handhelds), just take a look at the following pages.
See Video Game Systems to see how these specs work for them.
The nitty-gritty: