Media Notes / Graphics API

A graphics API is a type of Application Programming Interface (API) that standardizes the way applications tell graphics hardware what to draw on the screen.
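For a (much later) taste of what that looks like in practice, here's a minimal sketch using legacy OpenGL calls. It assumes a window and a GL context have already been set up by some toolkit such as GLUT or SDL, which is omitted here:

```cpp
// Minimal sketch: the application describes what to draw; the driver
// translates these calls into commands for whatever GPU is installed.
// Assumes a window and GL context already exist (toolkit code omitted).
#include <GL/gl.h>

void draw_frame() {
    glClear(GL_COLOR_BUFFER_BIT);   // ask the hardware to clear the screen
    glBegin(GL_TRIANGLES);          // start describing a triangle...
    glColor3f(1.0f, 0.0f, 0.0f);    // ...a red one...
    glVertex2f(-0.5f, -0.5f);
    glVertex2f( 0.5f, -0.5f);
    glVertex2f( 0.0f,  0.5f);
    glEnd();                        // ...and hand it off to the driver
}
```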

1980s: Expanding graphics in the same computer system

Before the IBM PC architecture, most home computers were fixed in terms of graphics capabilities, so application developers only needed to worry about developing for that one system. Outside of how much RAM the computer had and whatever peripherals were attached, there were no differing capabilities to account for. Most computers' graphics at the time were limited to two modes: a text mode, which was higher resolution but could only draw from a fixed character set, and a graphics mode, which allowed drawing anywhere on the screen but at a lower resolution.
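For the curious, here's a rough sketch of what the two modes looked like to a programmer, using DOS-era PC examples. It assumes a 16-bit real-mode compiler in the Turbo C mold and the standard IBM video memory addresses; the 256-color mode shown actually arrived later with VGA, but the text/graphics split worked the same way throughout the era:

```cpp
// Sketch of the text mode vs. graphics mode split on DOS-era PC
// hardware. Assumes real-mode DOS and a 16-bit compiler (MK_FP and
// the 'far' keyword are Borland/Turbo conventions).
#include <dos.h>

void put_text_char(int row, int col, char ch, unsigned char attr) {
    // Text mode: video memory at B800:0000 holds character/attribute
    // pairs; the hardware's character generator draws the glyphs, so
    // only characters from the fixed set can appear.
    unsigned char far* vram = (unsigned char far*)MK_FP(0xB800, 0);
    int offset = (row * 80 + col) * 2;  // 80 columns, 2 bytes per cell
    vram[offset]     = ch;              // which character
    vram[offset + 1] = attr;            // foreground/background colors
}

void put_pixel_mode13h(int x, int y, unsigned char color) {
    // Graphics mode (VGA mode 13h, 320x200 at 256 colors): every byte
    // is one pixel, so anything can be drawn, but at lower resolution.
    unsigned char far* vram = (unsigned char far*)MK_FP(0xA000, 0);
    vram[y * 320 + x] = color;
}
```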

Once the IBM PC took off and the idea that home computers could be expandable caught on, people started experimenting with graphics hardware that replaced the card the system shipped with from the factory. Initially, IBM Personal Computer systems shipped with either IBM's Monochrome Display Adapter (MDA) or Color Graphics Adapter (CGA). One of the first pieces of third-party graphics hardware was the Hercules Graphics Card, a monochrome adapter that added a bitmapped graphics mode to the MDA's text-only design, originally so it could display non-English characters that the fixed character set lacked. When IBM tried to capture the home computer market with the IBM PCjr, which included better graphics, Radio Shack developed a compatible computer with the same graphics capabilities, which became known as Tandy Graphics. Eventually IBM added EGA and then VGA, which meant that a computer in the late '80s could have one of at least five types of graphics hardware. If an application did not support a specific video adapter, it couldn't run on that hardware, though hardware vendors strove to at least emulate or support one of the base IBM video modes. For example, Hercules, despite being a monochrome adapter, could act like a CGA card using special software, mapping the colors to different monochrome shades. Likewise, IBM's EGA and VGA supported the video modes of the IBM cards that came before them.

Eventually, for PC-based systems, VGA and its descendants became the de facto standard for 2D graphics. When it came to 3D-accelerated graphics on PCs, however, everyone started doing their own thing once more: around the mid-'90s there were options from 3dfx, ATI, Intel, Matrox, NVIDIA, PowerVR, Rendition, and S3.

1990s: The rise of 3D accelerators and their APIs

As programming for 3D graphics hardware was a different beast than programming for 2D graphics, companies that made 3D accelerators each created an API specific to their hardware, making their cards easier to program for and take advantage of. The most famous of these was 3dfx's Glide API, which had the good fortune of being released around the same time as Quake, giving the API a Killer App.

When Microsoft was developing Windows 95, they created a set of APIs meant for games known as DirectX, one of the most notable components being Direct3D. This was primarily to give application developers a means to access the hardware from the OS environment rather than the bare DOS one, and possibly to avoid the situation where everyone was doing their own thing. However, by the time DirectX gained some traction, 3dfx's Voodoo cards were the de facto 3D accelerators, and while 3dfx cards did support Direct3D, most games offered a Glide render path. Eventually the hardware feature set that Direct3D required caught up to what Glide offered, allowing the rest of the companies to compete with 3dfx as long as they implemented Direct3D.

In the professional market, SGI was the 3D graphics hardware solution, and they too had a proprietary API, known as IRIS GL. IRIS GL had a problem: if an application used a feature of the API that the hardware did not support, the application would not run. To solve this, SGI developed OpenGL as an open standard, with software fallbacks to fill in feature gaps that the hardware didn't support. At one point, at the request of software developers, Microsoft talked with SGI about making a unified API known as Fahrenheit, but between SGI's financial issues, Microsoft wanting to leverage their DirectX API, and fading interest in general, the Fahrenheit idea died off. OpenGL was left to a group known as the OpenGL Architecture Review Board (ARB).

Eventually 3dfx started falling behind and their Glide API was no longer the standard, allowing both Direct3D and OpenGL to take off.

A footnote should be made for MeTaL, the API S3 introduced for their Savage 3D chipset, which turned out to be their last hurrah. The only thing the API is remembered for to this day is its extremely efficient texture compression algorithm, S3 Texture Compression (S3TC). It was quite advanced for its time, so once S3 realized MeTaL wasn't making any headway, they licensed S3TC to Microsoft for inclusion in DirectX.
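For the technically inclined, here's a sketch of the DXT1 variant of S3TC that made it so efficient: every 4x4 block of texels is squeezed into just 8 bytes, a fixed 6:1 ratio against 24-bit RGB. The struct and function names below are made up for illustration, but the bit layout is the documented format; the decode assumes the opaque (color0 > color1) block mode and a little-endian machine:

```cpp
// DXT1/S3TC block layout sketch: two RGB565 endpoint colors plus a
// 2-bit palette index per texel. 16 texels in 8 bytes.
#include <cstdint>

struct Dxt1Block {
    uint16_t color0;   // endpoint color A, RGB565
    uint16_t color1;   // endpoint color B, RGB565
    uint32_t indices;  // 16 texels x 2 bits: which of 4 palette entries
};

// Expand RGB565 to 8-bit-per-channel RGB.
static void rgb565_to_rgb888(uint16_t c, uint8_t rgb[3]) {
    rgb[0] = (c >> 11) * 255 / 31;          // 5 bits of red
    rgb[1] = ((c >> 5) & 0x3F) * 255 / 63;  // 6 bits of green
    rgb[2] = (c & 0x1F) * 255 / 31;         // 5 bits of blue
}

// Decode one texel (x, y each in 0..3) from a block.
void dxt1_texel(const Dxt1Block& b, int x, int y, uint8_t out[3]) {
    uint8_t c0[3], c1[3];
    rgb565_to_rgb888(b.color0, c0);
    rgb565_to_rgb888(b.color1, c1);
    int idx = (b.indices >> (2 * (y * 4 + x))) & 0x3;
    for (int ch = 0; ch < 3; ++ch) {
        switch (idx) {
            case 0: out[ch] = c0[ch]; break;
            case 1: out[ch] = c1[ch]; break;
            // The middle two palette entries are interpolated from the
            // endpoints (assuming the opaque color0 > color1 mode).
            case 2: out[ch] = (2 * c0[ch] + c1[ch]) / 3; break;
            case 3: out[ch] = (c0[ch] + 2 * c1[ch]) / 3; break;
        }
    }
}
```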

2000s: DirectX and OpenGL lead the way

Eventually there were only two graphics APIs with widespread support: Microsoft's DirectX and the ARB's OpenGL. For PC gaming, DirectX was typically used, as Windows dominated the personal computer market. OpenGL lived on in the professional sector, though major game engines like Unreal Engine and id Tech 3 and 4 also supported it.

Hardware design also shifted in how it was done: for new hardware to have a chance of being bought, it had to conform to both DirectX and OpenGL. So manufacturers worked closely with Microsoft and the ARB to figure out which features should be standardized. Manufacturers also exposed optional features as extensions, but these were rarely used in games precisely because they were optional. However, some of those features did make it into the standards one way or another.
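Under classic (pre-3.0 style) OpenGL, probing for one of those optional extensions looked roughly like the sketch below. It assumes a current GL context; the helper name is made up, and real code should match whole tokens rather than substrings, since one extension name can be a prefix of another:

```cpp
// Sketch: probing the driver's extension list in classic OpenGL.
// Assumes a GL context is already current on this thread.
#include <GL/gl.h>
#include <cstring>

bool has_extension(const char* name) {
    // glGetString(GL_EXTENSIONS) returns one space-separated list of
    // everything the driver supports. (Substring matching is the
    // quick-and-dirty classic approach; see the caveat above.)
    const char* ext = (const char*)glGetString(GL_EXTENSIONS);
    return ext && std::strstr(ext, name) != nullptr;
}

// Usage: fall back to uncompressed textures if S3TC isn't available.
// if (has_extension("GL_EXT_texture_compression_s3tc")) { ... }
```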

By the mid-to-late 2000s, it was looking like OpenGL was losing ground, largely because it lacked features that DirectX was pushing toward. Even John Carmack, an OpenGL diehard, admitted that Direct3D had pulled ahead in terms of features (though id's then-anticipated id Tech 5 engine would still end up shipping on OpenGL). One could argue that because of Microsoft's entry into the console business, they were able to experiment with and expand their API, with the Xbox and Xbox 360 sporting features that weren't even available on PCs at their time of launch. Stewardship of OpenGL was transferred to the Khronos Group in 2006, which has maintained the API since.

2010s: Hitting a wall and changing the paradigm

The New '10s hit it off with DirectX 11 and OpenGL 4.0. The two were now on par in terms of features, but a wall was being hit: both relied on a single thread to compile graphics rendering commands, which had to be executed in order. This could lead to GPU downtime, where a task was ready to be worked on but wasn't the next one in line. DirectX 11 tried to alleviate this by allowing multiple threads to record command queues for submission later, but the feature was optional and wasn't something that could simply be switched on and work. This single-queue design caused bottlenecks in CPU-intensive scenes, limiting the performance of the GPU.
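In rough terms, D3D11's optional multithreading looked like the sketch below: a worker thread records commands on a "deferred context", but everything still has to funnel through the single immediate context, in order. (Windows and D3D11 are assumed; the function name is invented, and error handling plus the actual draw calls are omitted.)

```cpp
// Sketch of D3D11 deferred contexts, the optional multithreading
// feature described above. Error handling omitted for brevity.
#include <d3d11.h>

void record_and_submit(ID3D11Device* device, ID3D11DeviceContext* immediate) {
    // A deferred context can be driven from a worker thread...
    ID3D11DeviceContext* deferred = nullptr;
    device->CreateDeferredContext(0, &deferred);

    // ...recording state changes and draw calls just like the
    // immediate context would (actual draw calls omitted here).
    ID3D11CommandList* cmdList = nullptr;
    deferred->FinishCommandList(FALSE, &cmdList);

    // But everything still goes through the one immediate context, in
    // order -- which is why this never fully removed the bottleneck.
    immediate->ExecuteCommandList(cmdList, TRUE);

    cmdList->Release();
    deferred->Release();
}
```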

In 2013, AMD set out to prove that more performance could be eked out of their cards and created a new API called Mantle. Mantle was designed to take advantage of multiple CPU cores, improve the efficiency of processing API calls, and allow applications to directly use some features of the hardware. Initially Mantle was compatible only with AMD GPUs, with the claim that other manufacturers could support it in the future. While Mantle showed promise, it also had the same problem as the proprietary APIs of the '90s: few wanted to support what was seen as an API with limited compatibility.

These ideas wouldn't go unnoticed. When making Windows 10, Microsoft designed DirectX 12 around the principles Mantle demonstrated. Seeing the writing on the wall, AMD offered Mantle to Khronos as the basis for their next-generation graphics API to replace OpenGL: Vulkan. While DirectX 12 has gained wide application support on Windows, Vulkan has gained even broader support thanks to being cross-platform. DirectX 11 and OpenGL remain common render paths, though, because the newer APIs require more technical expertise to use effectively.

Meanwhile Apple, ever wanting to enforce consumer and developer lock-in, introduced their own 3D API, Metal (no relation to S3's earlier-mentioned API), and deprecated OpenGL support in future versions of macOS. The API is exclusive to Apple's hardware, covering everything from their iOS mobile devices up to their Mac lineup, and is built with a lower-overhead design like DirectX 12 and Vulkan. The response to this move, unsurprisingly, has been somewhat negative: many developers of cross-platform apps now fear the uncertainty and difficulty of making a macOS port of their games, while those who develop exclusively for Apple's ecosystem are indifferent.

A common misconception is that DirectX 12 and Vulkan are "low-level" APIs, implying that they talk closer or more directly to the GPU hardware. They're more accurately called "lower-overhead" APIs. The primary difference between, say, DirectX 12 and 11 is that DX12 allows an application to create as many command queues as it wants, with the graphics driver picking tasks off the queues that fit in an execution time slice, and provides data structures that represent the graphics hardware for easier translation. That is, it's up to the application developer to manage rendering resources rather than the API, but the application still can't directly control the GPU's processing cores or directly access memory contents. In fact, at least on Windows, DirectX and other graphics APIs run in the higher-level user space, not the lower-level kernel space.
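To illustrate the "as many queues as it wants" point, here's a hedged D3D12 sketch (the function name is invented; device creation and error handling are trimmed): the application explicitly creates separate queues for graphics, compute, and copy work, and the driver schedules them, yet the application still never touches GPU cores or raw memory directly.

```cpp
// Sketch: a D3D12 application creating multiple command queues,
// something the D3D11 model didn't offer. Assumes Windows/D3D12.
#include <d3d12.h>

void create_queues(ID3D12Device* device) {
    D3D12_COMMAND_QUEUE_DESC desc = {};

    ID3D12CommandQueue* gfxQueue = nullptr;
    desc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;   // graphics (and everything else)
    device->CreateCommandQueue(&desc, IID_PPV_ARGS(&gfxQueue));

    ID3D12CommandQueue* computeQueue = nullptr;
    desc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;  // async compute work
    device->CreateCommandQueue(&desc, IID_PPV_ARGS(&computeQueue));

    ID3D12CommandQueue* copyQueue = nullptr;
    desc.Type = D3D12_COMMAND_LIST_TYPE_COPY;     // background uploads
    device->CreateCommandQueue(&desc, IID_PPV_ARGS(&copyQueue));

    // Note what the app does NOT get: no direct access to GPU cores or
    // raw memory contents -- the queues are still driver abstractions.
}
```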

