History: UsefulNotes/GraphicsProcessingUnit


The most notable systems to have CPU-driven graphics were the Atari 2600 and Sinclair ZX81. (A few "post-modern" systems also use this method; for example, the [[https://www.youtube.com/watch?v=_2uXqTi42LI Gigatron TTL]], as a GPU would violate the "computer without a microprocessor" gimmick of the system.)

to:

The most notable systems to have CPU-driven graphics were the Atari 2600 and Sinclair ZX81; a notable (and slightly insane) variation was the [[https://www.youtube.com/watch?v=Kg6RwOVS334 Vectrex]]. (A few "post-modern" systems also use this method; for example, the [[https://www.youtube.com/watch?v=_2uXqTi42LI Gigatron TTL]], as a GPU would violate the "computer without a microprocessor" gimmick of the system.)


The early forms of this GPU were just triangle/texture renderers. The CPU had to position each triangle properly each frame. Later forms, starting with arcade systems like the [[http://segaretro.org/Sega_Model_2 Sega Model 2]] and [[http://gaming.wikia.com/wiki/Namco_System_22 Namco System 22]], then the UserfulNotes/Nintendo64 console, and then the first [=GeForce=] PC chip, incorporated triangle [[http://gaming.wikia.com/wiki/Transform,_clipping,_and_lighting transform and lighting]] into the hardware. This allowed the CPU to say, "here's a bunch of triangles; render them," and then go do something else while they were rendered.

to:

The early forms of this GPU were just triangle/texture renderers. The CPU had to position each triangle properly each frame. Later forms, starting with arcade systems like the [[http://segaretro.org/Sega_Model_2 Sega Model 2]] and [[http://gaming.wikia.com/wiki/Namco_System_22 Namco System 22]], then the UsefulNotes/Nintendo64 console, and then the first [=GeForce=] PC chip, incorporated triangle [[http://gaming.wikia.com/wiki/Transform,_clipping,_and_lighting transform and lighting]] into the hardware. This allowed the CPU to say, "here's a bunch of triangles; render them," and then go do something else while they were rendered.
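To make the hand-off concrete, here is a minimal sketch of what "here's a bunch of triangles; render them" looks like through era-appropriate fixed-function OpenGL. It is an illustration only, not anything quoted from the article: the window/context setup is assumed to exist already, and the vertex data is made up.

    /* Illustrative sketch only: assumes an OpenGL context already exists.
       With hardware transform and lighting, the CPU just points the driver
       at a batch of vertices; transform (and lighting, if enabled) runs on
       the GPU while the CPU moves on to other work. */
    #include <GL/gl.h>

    static const float tris[] = {           /* x, y, z for two triangles */
        -1.0f, -1.0f, -5.0f,   1.0f, -1.0f, -5.0f,   0.0f, 1.0f, -5.0f,
         1.0f, -1.0f, -6.0f,   2.0f, -1.0f, -6.0f,   1.5f, 1.0f, -6.0f,
    };

    void draw_frame(void)
    {
        glEnableClientState(GL_VERTEX_ARRAY);
        glVertexPointer(3, GL_FLOAT, 0, tris);   /* "here's a bunch of triangles..." */
        glDrawArrays(GL_TRIANGLES, 0, 6);        /* "...render them" */
        glDisableClientState(GL_VERTEX_ARRAY);
    }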


PC [=GPUs=] of that era were designed for static desktop acceleration, rather than video game acceleration, so PC [=CPUs=] had to render games in software. As such, PC games were unable to match the smooth scrolling of consoles, due to consoles using tile-based [=GPUs=], which reduced processing, memory, fillrate and bandwidth requirements by up to 64 times. It was not until 1991, with the release of ''[[CommanderKeen Keen Dreams]]'', that PC gaming caught up to the smooth 60 frames/second scrolling of the aging UsefulNotes/NintendoEntertainmentSystem.

to:

PC [=GPUs=] of that era were designed for static desktop acceleration, rather than video game acceleration, so PC [=CPUs=] had to render games in software. As such, PC games were unable to match the smooth scrolling of consoles, due to consoles using tile-based [=GPUs=], which reduced processing, memory, fillrate and bandwidth requirements by up to 64 times. It was not until 1991, with the release of ''[[VideoGame/CommanderKeen Keen Dreams]]'', that PC gaming caught up to the smooth 60 frames/second scrolling of the aging UsefulNotes/NintendoEntertainmentSystem.
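As a rough worked example of where that "up to 64 times" ceiling comes from (assuming the common 8x8-pixel tile size, which is an assumption about the specific hardware): an 8x8 tile covers 8 x 8 = 64 pixels, but on a tile-based chip the CPU mostly just writes one tile index per tile and lets the hardware fetch the pixels, whereas a bitmap framebuffer obliges the CPU to touch all 64 pixels itself. One index versus 64 pixels is the theoretical 64x saving in work and bandwidth.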


A simple equation. But then, the {{Dreamcast}} released with hardware bump mapping capabilities, so developers wanted to apply 2 textures to a triangle. So this function became more complex:

to:

A simple equation. But then, the UsefulNotes/{{Dreamcast}} released with hardware bump mapping capabilities, so developers wanted to apply 2 textures to a triangle. So this function became more complex:
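The function the article goes on to show is not reproduced in this history entry; purely as a hypothetical stand-in (not the article's own pseudocode), the change amounts to a per-pixel colour routine growing from one texture lookup to a combination of two:

    /* Hypothetical stand-in, not the article's own pseudocode: with one
       texture the pixel colour is just the texture sample; with two, the
       hardware has to fetch and combine both per pixel (here a simple
       modulate of a base map by a bump/light map). */
    struct rgb { unsigned char r, g, b; };

    struct rgb shade_one(struct rgb base)
    {
        return base;                         /* single-texture: use it as-is */
    }

    struct rgb shade_two(struct rgb base, struct rgb bump)
    {
        struct rgb out = {
            (unsigned char)(base.r * bump.r / 255),
            (unsigned char)(base.g * bump.g / 255),
            (unsigned char)(base.b * bump.b / 255),
        };
        return out;                          /* two textures combined per pixel */
    }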


See the [[Trivia/GraphicsProcessingUnit trivia page]] for notable graphics processing units over the years.

to:

See the [[Trivia/GraphicsProcessingUnit trivia page]] for notable graphics processing units over the years.


One fascinating fact: Some very early home computers and consoles lacked what we would call a "GPU", instead only having a chip for interfacing with the Monitor with the CPU, and thus lacked any kind of chip that we would call a Graphics Processing Unit--that is, while they had a graphics ''chip'', that chip was effectively run entirely by the CPU, with no ability to function on their own. This was only possible because of design quirks in analogue television and monitor standards. In short, the image displayed on analogue televisions is surrounded by a field of "blacker then black" used to align the image properly. Non-display Processing could only take place during the period that this overscan field was being "drawn". (Note that many early systems used this quirk as well, for other reasons--most notably, the horizontal blanking period was a good time to do things to achieve a parallax effect; for further information, go look up "hblank" and "vblank")

to:

One fascinating fact: Some very early home computers and consoles lacked what we would call a "GPU", instead only having a chip for interfacing with the Monitor with the CPU, and thus lacked any kind of chip that we would call a Graphics Processing Unit--that is, while they had a graphics ''chip'', that chip was effectively run entirely by the CPU, with no ability to function on their own. This was only possible because of design quirks in analogue television and monitor standards. In short, the image displayed on analogue televisions is surrounded by a field of "blacker then black" used to align the image properly. Non-display Processing could only take place during the period that this overscan field was being "drawn". (Note that many systems as late as the 1990s used these quirks as well for other reasons--most notably, the horizontal blanking period was a good time to do things to achieve a parallax effect; for further information, go look up "hblank" and "vblank")
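For the curious, here is a C-flavoured sketch of the classic "wait for vertical blank" idiom the paragraph is describing. The register address and bit are modelled on the NES's PPU status register ($2002, bit 7 set while in vblank), but treat every detail here as illustrative rather than a recipe for any specific machine.

    /* Illustrative only: spin until the display enters the vertical blanking
       period, then do the work that must not collide with active drawing.
       Address and bit modelled on the NES PPU status register ($2002, bit 7). */
    #include <stdint.h>

    #define PPU_STATUS (*(volatile uint8_t *)0x2002)

    void wait_for_vblank(void)
    {
        while (!(PPU_STATUS & 0x80))
            ;                                /* beam is still drawing the picture */
    }

    void run_frame(void)
    {
        wait_for_vblank();
        /* safe window: update video memory / run "non-display" processing
           while the blanking interval and overscan are being "drawn" */
    }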


One fascinating fact: Some very early home computers and consoles lacked what we would call a "GPU", instead only having a chip for interfacing with the Monitor with the CPU, and thus lacked any kind of chip that we would call a Graphics Processing Unit--that is, while they had a graphics ''chip'', that chip was effectively run entirely by the CPU, with no ability to function on their own. This was only possible because of design quirks in analogue television and monitor standards. In short, the image displayed on analogue televisions is surrounded by a field of "blacker then black" used to align the image properly. Non-display Processing could only take place during the period that this overscan field was being "drawn".

to:

One fascinating fact: Some very early home computers and consoles lacked what we would call a "GPU", instead only having a chip for interfacing with the Monitor with the CPU, and thus lacked any kind of chip that we would call a Graphics Processing Unit--that is, while they had a graphics ''chip'', that chip was effectively run entirely by the CPU, with no ability to function on their own. This was only possible because of design quirks in analogue television and monitor standards. In short, the image displayed on analogue televisions is surrounded by a field of "blacker then black" used to align the image properly. Non-display Processing could only take place during the period that this overscan field was being "drawn". (Note that many early systems used this quirk as well, for other reasons--most notably, the horizontal blanking period was a good time to do things to achieve a parallax effect; for further information, go look up "hblank" and "vblank")

Added DiffLines:

!!CPU Driven Graphics

One fascinating fact: Some very early home computers and consoles lacked what we would call a "GPU", instead only having a chip for interfacing with the Monitor with the CPU, and thus lacked any kind of chip that we would call a Graphics Processing Unit--that is, while they had a graphics ''chip'', that chip was effectively run entirely by the CPU, with no ability to function on their own. This was only possible because of design quirks in analogue television and monitor standards. In short, the image displayed on analogue televisions is surrounded by a field of "blacker then black" used to align the image properly. Non-display Processing could only take place during the period that this overscan field was being "drawn".

Needless to say, this was horribly inefficient, but when the RAM needed for anything resembling a GPU was prohibitively expensive, it was also required to keep the costs of hardware down.

The most notable systems to have CPU-driven graphics were the Atari 2600 and Sinclair ZX81. (A few "post-modern" systems also use this method; for example, the [[https://www.youtube.com/watch?v=_2uXqTi42LI Gigatron TTL]], as a GPU would violate the "computer without a microprocessor" gimmick of the system.)


The term "GPU" was coined by [=nVidia=] upon the launch of their [=GeForce=] line of hardware. This was generally a marketing stunt, though the [=GeForce=] did have some fairly advanced processing features in it. However, the term GPU has become the accepted shorthand for ''any'' graphics processing chip, even [=pre-GeForce=] ones.

to:

The term "GPU" was coined by [=nVidia=] upon the launch of their [=GeForce=] line of hardware. This was generally a marketing stunt, stunt (as they [[BlatantLies claimed]] to have "invented the GPU" despite prior work by others), though the [=GeForce=] did have some fairly advanced processing features in it. However, the term GPU has become the accepted shorthand for ''any'' graphics processing chip, even [=pre-GeForce=] ones.


In the end, [=CPUs=] can execute a wide variety of programs at acceptable speed. [=GPUs=] can execute some special types of programs far faster than a [=CPU=], but anything else it will execute much slower, if it can execute it at all.

to:

In the end, [=CPUs=] can execute a wide variety of programs at acceptable speed. [=GPUs=] can execute some special types of programs far faster than a [=CPU=] can, but anything else it will execute much slower, if it can execute it at all.



Rendering graphics is not the only task that meets the requirements for processing efficiently on a [=GPU=]. Many tasks in science and applied mathematics involve massive numbers of vector and matrix operations that [=GPU=] are ideal for solving. This has given rise to the field of GPGPU (general purpose [=GPU=]) computing. Most modern supercomputers use [=GPU=]s to perform the brute force parts of scientific computations - and to the chagrin of gamers, the mid-to-late [=2010s=] has seen a major shortage of high-end gaming [=GPUs=] due to the advent of cryptocurrencies causing cryptocurrency miners to buy them in bulk for their mining rigs, doubling or even tripling retail prices.

to:

Rendering graphics is not the only task that meets the requirements for processing efficiently on a [=GPU=]. Many tasks in science and applied mathematics involve massive numbers of vector and matrix operations that [=GPU=] are ideal for solving. This has given rise to the field of GPGPU (general purpose [=GPU=]) computing. Most modern supercomputers use [=GPU=]s to perform the brute force parts of scientific computations - and to the chagrin of gamers, [=2017=] saw a major shortage of medium / high-end gaming [=GPUs=] due to the advent of cryptocurrencies, causing cryptocurrency miners to buy them in bulk for their mining rigs, doubling or even tripling retail prices. Retailers did wise up, and limited purchases to one card per person, but the price gauging lasted a good full year before going back down to normal around August [=2018=].
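As a tiny illustration of the kind of "massive numbers of vector operations" being talked about, here is SAXPY (y = a*x + y) written as a plain C loop; a GPGPU framework would run each iteration as its own thread, which is exactly why this style of work maps so well onto a GPU's thousands of small cores. The function is generic textbook code, not tied to any particular library.

    /* SAXPY (y = a*x + y): the archetypal GPGPU-friendly workload.  Every
       iteration is independent of the others, so on a GPU each one can be
       a separate thread instead of a step in a serial loop. */
    void saxpy(int n, float a, const float *x, float *y)
    {
        for (int i = 0; i < n; i++)
            y[i] = a * x[i] + y[i];
    }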


However, starting in the mid-90s with the creation of the [=DirectX=] (and specifically [=DirectDraw=]) API, as well as the introduction of hardware-accelerated [=QuickTime=] cards on the Macintosh front, games were finally able to take advantage of blitting, and the future of PCs moved from one that is mostly business oriented to one that is more oriented to gaming and multimedia.

to:

However, starting in the mid-90s with the creation of the [=DirectX=] (and specifically [=DirectDraw=]) API, as well as the introduction of hardware-accelerated [=QuickTime=] cards on the Macintosh front, games were finally able to take advantage of blitting, and the future of [=PCs=] moved from one that is mostly business oriented to one that is more oriented to gaming and multimedia.
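"Blitting" here just means copying a rectangle of pixels from one buffer to another. The sketch below is a plain software version of that copy (assuming 8-bit pixels; the function name and parameters are made up for illustration); the whole point of hardware blitters exposed through APIs like [=DirectDraw=] is that a dedicated chip performs this copy so the CPU doesn't have to move every pixel itself.

    /* A software "blit": copy a w-by-h rectangle of 8-bit pixels into the
       destination at (x, y).  pitch = bytes per scanline of each buffer.
       Hardware blitters perform this same copy without CPU involvement. */
    #include <stdint.h>
    #include <string.h>

    void blit(uint8_t *dst, int dst_pitch,
              const uint8_t *src, int src_pitch,
              int x, int y, int w, int h)
    {
        for (int row = 0; row < h; row++)
            memcpy(dst + (size_t)(y + row) * dst_pitch + x,
                   src + (size_t)row * src_pitch,
                   (size_t)w);
    }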


This kind of GPU, introduced to home systems by the TMS 9918/9928 (see below) and popularized by the UsefulNotes/{{NES}}, UsefulNotes/SegaMasterSystem and UsefulNotes/SegaGenesis, forces a particular kind of look onto the games that use them. You know this look: everything is composed of a series of images, tiles, that are used in various configurations to build the world.

to:

This kind of GPU, introduced to home systems by the [[https://en.wikipedia.org/wiki/Texas_Instruments_TMS9918 Texas Instruments TMS9918/9928]] and popularized by the UsefulNotes/{{NES}}, UsefulNotes/SegaMasterSystem and UsefulNotes/SegaGenesis, forces a particular kind of look onto the games that use them. You know this look: everything is composed of a series of images, tiles, that are used in various configurations to build the world.
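To show what "a series of images, tiles, used in various configurations" means in practice, here is a software model of the job a tile-based chip does in hardware. The 8x8 tile size and 32x30 map (256x240 pixels, NES-like numbers) are assumptions made for the sake of the example, not a description of any one chip.

    /* Software model of a tile-based display: the screen is described by a
       small grid of tile indices, and the pixels of each 8x8 tile come from
       a shared tile set.  Scrolling or animating means rewriting indices,
       not repainting every pixel -- which is the whole trick. */
    #include <stdint.h>

    #define TILE_W   8
    #define TILE_H   8
    #define MAP_W   32          /* 32 x 30 tiles = 256 x 240 pixels */
    #define MAP_H   30

    void render_tilemap(uint8_t screen[MAP_H * TILE_H][MAP_W * TILE_W],
                        const uint8_t tilemap[MAP_H][MAP_W],
                        const uint8_t tiles[][TILE_H][TILE_W])
    {
        for (int ty = 0; ty < MAP_H; ty++)
            for (int tx = 0; tx < MAP_W; tx++) {
                const uint8_t (*tile)[TILE_W] = tiles[tilemap[ty][tx]];
                for (int py = 0; py < TILE_H; py++)
                    for (int px = 0; px < TILE_W; px++)
                        screen[ty * TILE_H + py][tx * TILE_W + px] = tile[py][px];
            }
    }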


The UsefulNotes/NintendoEntertainmentSystem, for example, renders a 256x240 background and sixty-four 8x16 sprites at 60 frames/second, a tile fillrate equivalent to more than 4 megapixels/second, higher than what PC games rendered to a bitmap framebuffer until the early 1990s. The SegaGenesis renders two 512x512 backgrounds and eighty 32x32 sprites at 60 frames/second, a tile fillrate equivalent to more than 30 megapixels/second, higher than what PC games rendered in a bitmap framebuffer until the mid-1990s.

to:

The UsefulNotes/NintendoEntertainmentSystem, for example, renders a 256x240 background and sixty-four 8x16 sprites at 60 frames/second, a tile fillrate equivalent to more than 4 megapixels/second, higher than what PC games rendered to a bitmap framebuffer until the early 1990s. The UsefulNotes/SegaGenesis renders two 512x512 backgrounds and eighty 32x32 sprites at 60 frames/second, a tile fillrate equivalent to more than 30 megapixels/second, higher than what PC games rendered in a bitmap framebuffer until the mid-1990s.
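Those fillrate figures follow from straightforward arithmetic on the numbers given in the paragraph (background pixels plus sprite pixels, times 60 frames per second); the little program below just does the multiplication as a sanity check.

    /* Back-of-the-envelope check of the fillrate claims above, using only
       the per-frame pixel counts stated in the paragraph. */
    #include <stdio.h>

    int main(void)
    {
        long nes = (256L * 240 + 64 * 8 * 16) * 60;       /* background + 64 sprites    */
        long gen = (2L * 512 * 512 + 80 * 32 * 32) * 60;  /* 2 backgrounds + 80 sprites */
        printf("NES:     ~%.1f megapixels/second\n", nes / 1e6);   /* ~4.2  */
        printf("Genesis: ~%.1f megapixels/second\n", gen / 1e6);   /* ~36.4 */
        return 0;
    }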



The 80486DX2/66, a high-end gaming CPU of the early 90s, ran at 66 [=MHz=] and could run 32-bit code as an "extension" to 16-bit DOS. While faster than the CPU of the UsefulNotes/SegaGenesis and SuperNES, that alone was not enough to surpass them, as both consoles had tile-based [=GPUs=], which [=PCs=] were lacking at the time. It was through various programming tricks that [=PCs=] were able to exceed the Genesis and Super NES, by taking advantage of quirks in the way early [=PCs=] and VGA worked. Creator/JohnCarmack once described the engine underpinning his company's breakout hit ''VideoGame/Wolfenstein3D'' as "a collection of hacks", and he was not too far off. It was also the last of their games that could run in a playable state on an 80286 PC with 1 MB RAM -- a machine that was considered low-end even in 1992 -- which serves as a testament to the efficiency of some of those hacks.

to:

The 80486DX2/66, a high-end gaming CPU of the early 90s, ran at 66 [=MHz=] and could run 32-bit code as an "extension" to 16-bit DOS. While faster than the CPU of the UsefulNotes/SegaGenesis and UsefulNotes/SuperNES, that alone was not enough to surpass them, as both consoles had tile-based [=GPUs=], which [=PCs=] were lacking at the time. It was through various programming tricks that [=PCs=] were able to exceed the Genesis and Super NES, by taking advantage of quirks in the way early [=PCs=] and VGA worked. Creator/JohnCarmack once described the engine underpinning his company's breakout hit ''VideoGame/Wolfenstein3D'' as "a collection of hacks", and he was not too far off. It was also the last of their games that could run in a playable state on an 80286 PC with 1 MB RAM -- a machine that was considered low-end even in 1992 -- which serves as a testament to the efficiency of some of those hacks.


Before the rise of Windows in the mid-1990s, most PC games couldn't take advantage of newer graphics cards with hardware blitting support (or even the framebuffer, for that matter). While the first ''accelerated'' graphics card came out in 1991, many games that were written ignored the features- the CPU had to do all the work, and this made both a fast CPU and a fast path to the video RAM essential. [=PCs=] with local-bus video and 80486 processors were a must for games like ''VideoGame/{{Doom}}'' and ''VideoGame/{{Heretic}}''; playing them on an old 386 with ISA video was possible, but wouldn't be very fun. The only program that could remotely take advantage of the features found in these new cards was Windows 3.0, and even then games for the platform were mostly graphically simple as it was deemed to be too taxing to handle graphics-heavy games.

to:

Before the rise of Windows in the mid-1990s, most PC games couldn't take advantage of newer graphics cards with hardware blitting support (or even the framebuffer, for that matter- as a matter of fact, many earlier GPUs drew directly to the screen memory and lacked a framebuffer). While the first ''accelerated'' graphics card came out in 1991, many games that were written ignored the features- the CPU had to do all the work, and this made both a fast CPU and a fast path to the video RAM essential. [=PCs=] with local-bus video and 80486 processors were a must for games like ''VideoGame/{{Doom}}'' and ''VideoGame/{{Heretic}}''; playing them on an old 386 with ISA video was possible, but wouldn't be very fun. The only program that could remotely take advantage of the features found in these new cards was Windows 3.0, and even then games for the platform were mostly graphically simple as it was deemed to be too taxing to handle graphics-heavy games.


Before the rise of Windows in the mid-1990s, most PC games couldn't take advantage of newer graphics cards with hardware blitting support (or even the framebuffer, for that matter). While the first ''accelerated'' graphics card came out in 1991, many games were written to do most of the work on the GPU; the CPU had to do all the work, and this made both a fast CPU and a fast path to the video RAM essential. [=PCs=] with local-bus video and 80486 processors were a must for games like ''VideoGame/{{Doom}}'' and ''VideoGame/{{Heretic}}''; playing them on an old 386 with ISA video was possible, but wouldn't be very fun. The only program that could remotely take advantage of the features found in these new cards was Windows 3.0, and even then games for the platform were mostly graphically simple as it was deemed to be too taxing to handle graphics-heavy games.

to:

Before the rise of Windows in the mid-1990s, most PC games couldn't take advantage of newer graphics cards with hardware blitting support (or even the framebuffer, for that matter). While the first ''accelerated'' graphics card came out in 1991, many games that were written ignored the features- the CPU had to do all the work, and this made both a fast CPU and a fast path to the video RAM essential. [=PCs=] with local-bus video and 80486 processors were a must for games like ''VideoGame/{{Doom}}'' and ''VideoGame/{{Heretic}}''; playing them on an old 386 with ISA video was possible, but wouldn't be very fun. The only program that could remotely take advantage of the features found in these new cards was Windows 3.0, and even then games for the platform were mostly graphically simple as it was deemed to be too taxing to handle graphics-heavy games.
