Will the "Bits=Better Graphics" misconception ever go away?

TheBeanerItWas Since: Sep, 2013
#1: May 1st 2014 at 7:47:07 PM

Recently I had to make edits and add links to undermine the misconception that bits equal the power of console hardware, and I was wondering: is this knowledge common on the internet, or does the outdated "more bits are better" belief still prevail?

For anyone who wants to understand what I'm talking about, see How Video Game Specs Work.
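
For a concrete idea of what the number actually measured: "bits" mostly referred to the CPU's word size, i.e. how wide an integer the processor handles natively, not to graphics power on its own. A minimal C sketch of the difference, my own illustration rather than anything from that page:

```c
#include <stdint.h>
#include <stdio.h>

int main(void) {
    /* Console-era "bits" mostly meant CPU word size: how wide an
       integer the processor handles natively in one operation. */
    uint16_t counter16 = 65535;  /* the largest value a 16-bit word holds */
    uint32_t counter32 = 65535;

    counter16 += 1;  /* overflows and wraps around to 0 in a 16-bit word */
    counter32 += 1;  /* a 32-bit word still has plenty of headroom */

    printf("16-bit counter: %u\n", (unsigned)counter16);  /* prints 0 */
    printf("32-bit counter: %u\n", (unsigned)counter32);  /* prints 65536 */
    return 0;
}
```

A wider word just means more numeric headroom per operation, which is part of why caps like 255 and 65535 show up in old games; by itself it says nothing about graphics quality.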

Randomness4 Snow Ghost from The Land of Inconvenience Since: Sep, 2011
#2: May 1st 2014 at 7:54:09 PM

I believe the "bits" described the amount of space games could fill inside cartridges and CDs as games progressed. The "bits" had nothing to do with the console's actual hardware power; it was just a way to tell what games the system could actually handle.

edited 1st May '14 7:58:01 PM by Randomness4

YO. Rules of the Internet 45. Rule 45 is a lie.
Surenity Since: Aug, 2009
#3: May 1st 2014 at 10:55:18 PM

I think the Sega Dreamcast was the last console to brag about how many bits it had (128, or so it claimed). Yet the '90s console wars cast a long shadow.

My tropes launched: https://surenity2.blogspot.com/2021/02/my-tropes-on-tv-tropes.html
KJMackley Since: Jan, 2001
#4: May 2nd 2014 at 12:35:49 AM

Even if it did, some other misconception would take its place. In modern gaming, processor and memory are considered the measuring sticks of better graphics. The reality is that the OS is the most crucial element, in conjunction with the architecture of the whole shebang. The thing is that people like assigning neat numbers to something to give it a Power Level.

SgtRicko Since: Jul, 2009
#5: May 2nd 2014 at 8:08:34 AM

Well, the misconception is dying somewhat, because the attributes everyone now looks at to describe power are the amount of RAM and the strength of the processor's cores, or sometimes the max resolution (e.g. 1080p). And even then, some of you would know that it's not entirely correct to judge by those either, because game designers have become rather crafty with using the resources of their machines.

PhysicalStamina Since: Apr, 2012
#6: May 2nd 2014 at 12:19:31 PM

Who even uses "bits" to describe a game's graphics in this day and age?

Clarste One Winged Egret Since: Jun, 2009 Relationship Status: Non-Canon
#7: May 2nd 2014 at 12:54:21 PM

Well, people still use 8-bit and 16-bit to talk about retro gaming stuff. But since they also use it to refer to music, I don't think "better graphics" is really part of it.

Balmung Since: Oct, 2011
#8: May 2nd 2014 at 2:44:12 PM

Usually, they refer to the era and the capabilities of the consoles that boasted those numbers.

8-bit = NES era (or style, in the case of retraux)

16-bit = SNES and Genesis era (or style for retraux)

edited 2nd May '14 2:45:12 PM by Balmung

Total posts: 8