

Platform / IBM Personal Computer


Born in the wake of the Apple ]['s success, the IBM Personal Computer (dubbed the "5150" in IBM's internal numbering system) was the International Business Machines Corporation's official entry into the desktop computer system market, and by far their most successful. Earlier attempts, like the 5100 desktop APL machine and the DisplayWriter word-processing machine, hadn't taken off, and IBM needed something fast to compete with Apple. Bypassing the usual IBM bureaucracy, in 1980 they tasked a team of engineers in an IBM office in Boca Raton, Florida with developing the new machine and gave them an unusual amount of freedom in developing the system.

What appeared in August 1981 was nothing like any IBM machine built before. Like the Apple II, the IBM PC was built almost completely out of off-the-shelf parts and had a generous amount of expansion capability. As for the system design, the Boca Raton team considered several processors (including IBM's own ROMP CPU and the Motorola 68000) before settling on Intel's 16-bit 8088. The 8088 was chosen mainly for cost and time-to-market reasons; the ROMP was still experimental, and IBM was concerned that the 68000 wouldn't be available in quantity. Also, the 8088 could re-use many of the support chips Intel had designed for the 8085, making the motherboard design simpler. To ensure a steady supply of 8088s, IBM and Intel recruited Advanced Micro Devices (AMD) to act as a second source, a decision that would have some importance later.

The other big influence on the IBM PC's design was the world of S-100 machines, which were based around the Intel 8080 (or, later, the Zilog Z80) and the "S-100" bus that had been introduced in the pioneering Altair 8800. These machines ran an OS called CP/M, which had been invented by a programmer named Gary Kildall in 1974 and was based indirectly on Digital Equipment Corp.'s various operating systems for their PDP series of minicomputers. While they weren't nearly as slick as the Apple ][, S-100 machines were popular with hobbyists and businesses alike, and several CP/M applications for businesses, like WordStar and dBASE, were making inroads.

S-100 machines were large, server-style boxes with a large number of slots inside, plugged into a central backplane with power and data signals on it. The cards themselves were large and nearly square. To save space, IBM decided against using the S-100 backplane system, and instead went with Apple II-style cards that were long and rectangular, with a 62-pin edge connector near the back end of the card. IBM also added a sheet-metal bracket to the back of the card to add some structural stability. Since the PC used a regulated, switching power supply, the hot-running secondary regulators that S-100 cards used were also no longer necessary.

In one break with the Apple II's precedent, and as an improvement on the serial consoles S-100 machines used, IBM decided to leave the graphics system off the motherboard and provide two add-on cards — a text-only Monochrome Display Adapter (MDA), intended for business users, and a Color/Graphics Adapter (CGA), for games, education and emulating other color-capable IBM hardware. This was done to give buyers a choice of video hardware, as well as to save space on the motherboard. While MDA was widely praised for its outstanding clarity and readability, especially when combined with IBM's original 5151 monitor, which showed off MDA's effective 720×350 resolution, CGA had a barely adequate 320×200 mode (with strange, unnatural-looking palettes, to boot) and a distorted, monochrome-only 640×200 mode, and was nearly universally panned. CGA's composite mode, on the other hand, supported 16 colors by exploiting NTSC artifacting, comparing favorably to the Commodore 64 and the Apple ][, as this video by The 8-Bit Guy shows. IBM intended for most people to use the composite mode with a TV via an RF modulator or with a dedicated composite monitor, just as with other home computers of the era, and many early PC games were written with composite CGA in mind. Composite CGA had the downside that 80-column text was unreadable due to artifacting, making it useless for word processing. People who wanted the 16-color mode and needed to work with text either used two monitors, one composite and one RGB, or used a monitor that could switch between composite and RGB modes. Some composite monitors could switch to monochrome for working with 80-column applications, or people using a TV could just turn the color knob all the way down.

In a burst of brilliance, the engineers of the PC decided to make it possible to install both MDA and CGA cards in the same machine, creating the earliest instance of a multi-monitor PC setup. While the setup was pricey, tricky to configure, and required more desk space than a typical setup of the time, many professional-grade software packages (mostly desktop publishing, engineering and development software) could take advantage of it. Unfortunately, many consumer-grade programs, especially games, presumed that only the CGA card was installed and did arcane things in the memory space normally used by the MDA card. This caused issues ranging from glitching to outright system crashes when said software was run on a PC with both cards installed, relegating such configurations to the offices of professionals, and then typically only on workstation-class machines, until Windows 98 revived the idea by offering official support for multiple GPUs and spreading the desktop across multiple monitors.

Also, those living in PAL regions never got to experience composite CGA as intended. PAL's superior color separation, often touted as its strongest selling point, was its downfall here: PAL is largely immune to NTSC-style artifacting, so in PAL territories composite output showed the same unnatural palettes as an RGB monitor. Many PC clones sold in these regions, and even some clones in the US, lacked composite video output entirely. Compounding the issue, earlier batches of the PC sold in PAL countries still had CGA cards that output 60 Hz NTSC, at a time when NTSC playback and multi-system TVs were almost non-existent, meaning composite was barely, if ever, used in the region.

There are other tricks to get more colors out of a CGA card, but not all of them are officially documented, and all have their own limitations. The 160×100 trick, which involves reprogramming text mode to provide a 16-color, tile-based graphics mode, does not work on certain clone chipsets (particularly those driving LCD panels on certain early laptops and portables), and doesn't work at all on EGA or VGA cards, while the palette-switch trick requires precise timing and frequently failed if the processor wasn't a genuine Intel 8088 running at the original 4.77 MHz. Still, it wasn't until the advent of EGA in 1984 that anything more adequate appeared, so everybody kept using CGA — or a third-party Hercules monochrome card, which could address individual pixels, unlike the MDA, but was much more expensive. EGA solved the problems with CGA, offering 16 colors over cleaner RGB connections.
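For the curious, the addressing math behind the 160×100 trick can be sketched out. This is an illustrative model rather than period code: it assumes the usual variant of the trick, where the card stays in 80-column text mode, the CRTC is reprogrammed for 100 rows of two-scanline-tall characters, every cell is filled with the half-block character 0xDE, and blinking is disabled so that both nibbles of each attribute byte become two independent 16-color "pixels":

```c
#include <stdint.h>

/* Sketch of pixel addressing in the tweaked 160x100 CGA mode.
   The screen is an 80x100 grid of text cells; every cell holds
   character 0xDE (right half block), so the attribute byte's
   background nibble paints the left pixel and its foreground
   nibble paints the right one. */
enum { CELLS_PER_ROW = 80 };

/* Byte offset of the attribute byte controlling pixel (x, y). */
uint16_t cga160_attr_offset(int x, int y) {
    int cell = y * CELLS_PER_ROW + x / 2; /* two pixels per cell */
    return (uint16_t)(cell * 2 + 1);      /* attribute follows the char byte */
}

/* Write a 4-bit color into the correct nibble of the attribute byte. */
void cga160_put_pixel(uint8_t *vram, int x, int y, uint8_t color) {
    uint16_t off = cga160_attr_offset(x, y);
    if (x & 1)  /* odd x: right pixel = foreground (low nibble) */
        vram[off] = (uint8_t)((vram[off] & 0xF0) | (color & 0x0F));
    else        /* even x: left pixel = background (high nibble) */
        vram[off] = (uint8_t)((vram[off] & 0x0F) | ((color & 0x0F) << 4));
}
```

On real hardware `vram` would point at the CGA text buffer at segment 0xB800; here it can be any 16,000-byte buffer, which also shows why the trick breaks on clone chipsets that don't emulate the CRTC faithfully.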

The base system came with just 16K of RAM (upgradable to 64K), like the Apple II, but could be expanded to a then-breathtaking 640K thanks to the Intel 8088 processor inside, which had a 1 MB address space (huge for a desktop machine in 1981). It was possible to go above the 640K memory limit through a combination of backfilling and "UMA" RAM, or by using EMS cards, which used bank switching (similar to some Nintendo Entertainment System game cartridges) to address the additional memory. While the memory cards were slow by today's standards, they were still faster than swapping data off the even slower floppy drive. They weren't cheap either, but the fact that several business and professional programs could use them to speed up processing meant they still sold well until the PS/2, with its larger memory limit, was introduced — and even then EMS lived on in emulated form, with the EMM386 driver presenting memory above 640K to programs as EMS memory, although newer programs could access that memory directly.
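The bank switching those EMS boards performed can be modeled in a few lines. This sketch assumes the common configuration — a 64K page frame in upper memory divided into four 16K windows, each independently mappable to any 16K page of the card's memory; the names and the array-based "card" are purely illustrative, since a real board was programmed through I/O port writes:

```c
#include <stdint.h>

/* Toy model of EMS-style bank switching: the CPU sees a small,
   fixed page frame, and software remaps which part of the much
   larger expanded-memory pool appears behind each window. */
#define EMS_PAGE_SIZE  (16 * 1024)
#define EMS_WINDOWS    4
#define EMS_POOL_PAGES 64            /* 1 MB of expanded memory */

typedef struct {
    uint8_t pool[EMS_POOL_PAGES][EMS_PAGE_SIZE]; /* RAM on the card */
    int     mapped[EMS_WINDOWS];                 /* pool page per window */
} EmsCard;

/* Stand-in for writing the board's mapping register:
   point one 16K window at one 16K pool page. */
void ems_map(EmsCard *c, int window, int page) {
    c->mapped[window] = page;
}

/* A CPU access through the page frame: translate an offset within
   the 64K frame into a byte of the currently-mapped pool page. */
uint8_t *ems_frame_byte(EmsCard *c, uint32_t frame_offset) {
    int window = frame_offset / EMS_PAGE_SIZE;
    return &c->pool[c->mapped[window]][frame_offset % EMS_PAGE_SIZE];
}
```

A program wanting more than 640K would copy data into a window, remap it elsewhere, and repeat — slower than real addressing, but far faster than the floppy drive.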

IBM followed up the PC with the XT in 1983, which removed the original PC's cassette interface, added more expansion slots (along with an optional expansion chassis), and made a hard drive option available. 1983 also saw the introduction of the PCjr, a severely crippled version of the XT intended for home use; its main claims to fame were the addition of a 16-color, 320×200 graphics mode and an internal 4-voice PSG (the same Texas Instruments model used in their own TI-99 series and, more famously, in the ColecoVision), both of which inspired one of the most famous clone families, the Tandy 1000. Next was the PC/AT in 1984, which introduced the 80286 processor and a fully 16-bit architecture, along with the Enhanced Graphics Adapter (EGA), which finally made 16-color graphics (at resolutions up to 640×350, although 320×200 remained the most popular for games) possible on a regular PC. And with that, with the capacity for attractive applications and especially entertainment software, the march of history began...

The Rise of the Clones

At first, the IBM PC didn't have much to offer home users and gamers. It was new, expensive, not as good with graphics as the Apple ][ or the Atari 800, and was directed squarely at business users. However, IBM's name on the machine made it a safe buy for businesses that already used IBM hardware, and they ended up buying the machines in droves. The machine's open design sparked a huge third-party expansion market, with dozens of vendors selling memory expansion boards, hard drive upgrades and more. It wasn't long until other computer makers started examining the PC's design and figuring out how to make clones of the machine that could run PC software without issues. The one thing stopping them, however, was the ROM. IBM had a copyright on what they called the "ROM BIOS", and while cloning the hardware was easy, cloning the ROM would be much harder, with few vendors able to get it completely right (and the few that tried too hard, such as Eagle, getting sued into oblivion). It wasn't until Compaq introduced the Portable in 1983 that a truly 100% IBM-compatible PC was available, and after that, software houses such as Phoenix, Award and American Megatrends followed suit, opening the floodgates to an entire industry of low-priced PC compatibles.

IBM also had another problem to deal with: Microsoft. When the PC was first being developed, IBM decided they wanted to license an outside OS rather than attempt to write their own, and their first choice would have been CP/M. However, when they tried to meet with Gary Kildall to license it, he wasn't around to sign the papers; the full details are unclear and have become something of a legend, but in the end, IBM didn't get CP/M. What they did get was the product of another little-known Seattle software developer's own frustration with CP/M: MS-DOS. MS-DOS began life as an admittedly "quick and dirty" clone of CP/M, written by a developer named Tim Paterson at Seattle Computer Products.

SCP was mostly a hardware outfit, whose business was in memory upgrades and other add-ons for the aforementioned S-100 machines. When the 8086 appeared on the market, they wanted to use it and quickly threw together a prototype machine. Digital Research had long promised an 8086/8088 port of CP/M but didn't deliver until it was too late, leading Paterson to write his own and name it "86-DOS" or "QDOS" (depending on who you ask). Microsoft, who had already basically lied to IBM by saying they had something ready (they did have an OS, called Xenix, but it was a UNIX derivative and IBM wouldn't accept that; Xenix was later sold to the Santa Cruz Operation), paid SCP and Paterson $50,000 for the rights to 86-DOS, did some quick editing and released that as MS-DOS/PC-DOS 1.0. Microsoft also put language in their contract with IBM stating that they had the right to license MS-DOS to whomever they wanted without first seeking IBM's approval. This had serious implications for the PC clone business: once the BIOS makers American Megatrends, Award and Phoenix had opened the floodgates on the hardware side, Microsoft could sell the hardware makers MS-DOS, thus creating a complete package — and a huge pain for IBM.

While Microsoft intended MS-DOS to be a universal operating system where applications could be written once and run anywhere (similar to UNIX), its programming interface was so poor that many software developers bypassed it and directly accessed the hardware. Even Microsoft itself was guilty of this, with early versions of Microsoft Flight Simulator often used as a compatibility benchmark. A number of manufacturers did introduce MS-DOS-based computers, but they all failed in the marketplace because they weren't fully IBM compatible.

Lotus 1-2-3, a spreadsheet that was the IBM PC's Killer App, was praised for its speed because it had been tightly coded in assembler and directly accessed the hardware of the PC. This meant that any clone would need to be as compatible with the hardware as possible in order to run Lotus 1-2-3, which became a litmus test of PC compatibility. This would have consequences for the evolution of the platform's hardware, particularly the "640k barrier."

"IBM compatible" became to the personal computer industry what VHS was to the VCR in the '80s. While some other platforms might have been technically better, that didn't matter compared to the huge variety of software and hardware peripherals available for the PC, much like how the main advantage of VHS was the large amount of content available for it. The PC (or more importantly MS-DOS on x86 processors) became a de-facto standard everyone rallied around, and proved to be a safe bet for new companies entering the computer market. The earliest clone manufacturers were Eagle, Columbia Data Products and Compaq. A number of small mom-and-pop operations sprang up using cheap parts from Asia.

In terms of gaming, the Tandy 1000 was the turning point. A clone of the ill-fated IBM PCjr, the Tandy 1000 featured greater PC compatibility and expansion using standard expansion cards instead of sidecars while keeping the enhanced graphics and sound. With the porting of King's Quest, it proved that the PC could be a viable gaming platform. The Tandy 1000 and other cheap clones such as the Leading Edge Model D shifted the home computer market in the U.S. away from the Commodore 64 to the PC in the latter half of the 1980s. Such was the strength of the PC market that even Commodore and Atari came out with their own PC clones.

IBM Tries To Win Back The Crowd

With all of the pieces in place, the clone market took off like a shot after 1984. New companies building PCs based on cheap, mass-market "motherboards" made in factories in Taiwan, mainland China, and occasionally Japan, Korea or Malaysia were popping up everywhere, and Compaq became a Fortune 500 company on the success of its Portable and Deskpro ranges. In 1986 Compaq beat IBM to the punch with the first PC to use the new, 32-bit 80386 processor. Between the clone armies and Compaq's meteoric rise, IBM decided that if it couldn't compete on price, it would compete on features, and (it was hoped) introduce a new standard that they alone would control.

The result was the Personal System/2 (or PS/2 for short), a line of new PC-based machines that were deliberately very different from the prevailing PC standards. The new machines used a new, IBM-proprietary expansion bus called "Micro Channel", which was faster than the AT's bus (by now referred to as "ISA", for "Industry Standard Architecture", since IBM claimed a trademark on "AT") but completely incompatible and protected by IBM patents, requiring anyone who wanted to use it to go through a lengthy licensing process and pay royalties. The other major feature the PS/2 line introduced was a new video subsystem called the Video Graphics Array (VGA), a substantial upgrade to EGA that added a new 640×480 high-resolution mode (familiar now as the mode Windows 2000 and XP use for their splash screens), analog RGB video with an 18-bit palette (over 262,000 colors), and up to 256 colors on-screen at once. VGA was accepted by the rest of the industry enthusiastically, with 100% VGA compatibility becoming a must for video-card makers.
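The "18-bit palette" works out to 6 bits per color channel: each of the 256 palette slots holds three 0–63 values, programmed by writing a slot index to port 0x3C8 followed by three data bytes to port 0x3C9. A minimal sketch of the conversion (the struct and function names are made up for illustration):

```c
#include <stdint.h>

/* One VGA DAC palette entry: 6 bits per channel, 0..63 each,
   giving 64 * 64 * 64 = 262,144 possible colors. */
typedef struct { uint8_t r, g, b; } VgaDacEntry;

/* Reduce a modern 8-bit-per-channel color to the DAC's range by
   dropping the two low bits of each channel (0..255 -> 0..63). */
VgaDacEntry vga_dac_from_rgb24(uint8_t r, uint8_t g, uint8_t b) {
    VgaDacEntry e = { (uint8_t)(r >> 2), (uint8_t)(g >> 2), (uint8_t)(b >> 2) };
    return e;
}
```

Only 256 such entries can be loaded at once, which is where the "18-bit palette but 256 colors on-screen" distinction comes from.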

VGA proved very popular with game developers. What it lacked in tricks like sprites, blitting and scanline DMA, it compensated for by being tweakable (hacked 256-color modes were very popular, providing resolutions up to 360×480) and having high-speed, easy-to-use video memory. The base 320×200×256 mode, however, was the easiest and the fastest, since its memory window fit neatly into a 64K 8086 segment and didn't require the time-consuming bank switching the 16-color modes used, and many groundbreaking games of the late 1980s and early 1990s were written with this mode in mind, including the Sierra and LucasArts point-and-click adventures, Wolfenstein 3-D, and Doom. The 640×480×16 mode, on the other hand, was extremely popular with early graphical OSes and GUI-based DOS software, and it remains a bare-bones compatibility video mode in many modern OSes as well. Later IBM innovations, like the 1024×768 XGA, added a few more-or-less standard modes to the swirling chaos generally known as "Super VGA".
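Part of why the 320×200×256 mode (BIOS mode 13h) was so beloved is visible in the arithmetic: 320 × 200 = 64,000 bytes at one byte per pixel, so the whole frame buffer fits in a single real-mode segment at 0xA000, and a pixel write is one multiply, one add and one store. A sketch, with hypothetical names:

```c
#include <stdint.h>

/* Mode 13h pixel addressing: linear frame buffer, one byte per
   pixel, 64,000 bytes total -- no planes, no bank switching. */
enum { MODE13_WIDTH = 320, MODE13_HEIGHT = 200 };

uint16_t mode13_offset(int x, int y) {
    /* The whole screen fits in one 64K segment, so a 16-bit
       offset is enough to address any pixel. */
    return (uint16_t)(y * MODE13_WIDTH + x);
}

void mode13_put_pixel(uint8_t *vram, int x, int y, uint8_t color) {
    vram[mode13_offset(x, y)] = color; /* a single write */
}
```

On real hardware `vram` would point at segment 0xA000; the 16-color modes, by contrast, spread each pixel across four bit planes, which is the "time-consuming" part the text mentions.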

While most PC gamers had made do with the old beeping speaker (outside of those with the Tandy 1000's three-voice sound chip inherited from the PCjr), sound was rapidly improving in the late '80s as well, independently of IBM. The AdLib and Roland MT-32 sound cards added synthesized music to PC games and were widely supported in the late '80s and early '90s, but the Creative Labs Sound Blaster, introduced in 1989, added digitized sound. This, combined with AdLib compatibility, made it an irresistible add-on for serious PC gamers.

Local Bus Wars

Micro Channel didn't fare so well, though. Only a handful of other PC makers adopted the bus, and while a few outside peripheral makers made MCA-compatible devices, the vast majority were designed and built by IBM as build-to-order add-ons. Miffed at IBM's attempt to corner the high-end PC market, Compaq and several other PC makers introduced a competing standard called "Extended ISA", or EISA for short. EISA expanded the bus to 32 bits and added "bus mastering" support (which let the CPU do other things while a data transfer from, say, a disk was happening) and MCA-style, semi-automatic configuration, while maintaining compatibility with regular ISA cards. EISA was popular mainly on servers and high-end PCs; desktops didn't need that kind of bandwidth yet. Its auto-configuration system was later backported to ISA as "Plug and Play", as part of the development effort leading up to Windows 95.

In the early 1990s there was another competitor bus, the VESA Local Bus (or VLB for short), which added a 32-bit wide, full-speed side channel to the ISA bus. VLB was originally designed to give bandwidth-hungry GPUs a faster connection to the system bus, but it was also something of a stopgap measure and didn't last long. Its biggest problem was that it was too deeply tied to the internals of the 486 processor, for which it was developed; the Pentium used a completely different memory bus setup, and converting between the two was notoriously difficult. Also, VLB's specification was not very rigid, and almost all manufacturers tweaked it a bit. This lack of precision made running anything other than video on VLB a potentially dangerous proposition, especially if Mass Storage was involved; most IDE controllers of the day generated their timing signals directly from the VLB, and if it was running faster than the controller expected, bad things (such as missed interrupts) would happen. VLB also had mechanical problems. Physically, it was a high-density edge connector positioned next to (and on the far side of) an ISA-16 slot, so it could be used with full-length cards only (earning it the Fan Nickname "Very Long Bus"); since many cases of the day weren't designed with full-length cards in mind, inserting a VLB card was often quite difficult and risked damaging the card, the motherboard, or the case.

In the end, Intel's new "Peripheral Component Interconnect" (PCI) spec won the "local-bus wars" due to its platform-agnostic nature and cleaner architecture. PCI was first announced in 1992, and became popular with the first PCI 2.0 machines in 1995, making VLB obsolete almost immediately. PCI used a single, 124-pin, high-density edge connector (much like 16-bit MCA), which made ISA-style short cards possible. Unlike EISA, PCI wasn't backwards-compatible with ISA electrically, but PCI cards could co-exist with ISA and VLB, and Intel deliberately made the specification easy to implement, meaning that a lot of the barriers to acceptance that MCA had were gone. It was also designed for future expansion, with "fast" and "wide" versions (66 MHz, 64-bit) designed into the spec instead of being third-party tweaks. PCI spread quickly, showing up on non-Intel-based machines like Macs and Suns, and was eventually deemed MCA's successor by IBM in the late 1990s.

In the late 1990s, a new standard was launched: the "Accelerated Graphics Port" (AGP). It was not meant to replace PCI but to supplement it, as PCI was starting to bottleneck GPU performance. AGP saw several revisions, but its biggest issue was that some older cards could not run on newer motherboards and vice versa, due to a signaling-voltage change late in the standard's life. While AGP's era was uneventful, it was also short-lived: GPUs started improving in performance at an unprecedented rate, and very soon AGP was no longer sufficient. What happened next, however, was yet another bus war.

In the early 2000s, another bus war erupted, this time due to a split within the PCI standards group. On one side was PCI-X, a standard that was completely backwards compatible with PCI but bulky and rather expensive to implement (it followed the VLB route and added an extension next to the regular PCI slot, meaning PCI-X cards and slots were longer and bulkier), and on the other was PCI Express, a ground-up In Name Only redesign promising very fast speeds but using a serial implementation (similar to USB and FireWire) that wasn't electrically backwards compatible with PCI at all (although it could co-exist with the PCI bus). While PCI-X had some heavy backing from Apple, Compaq and IBM, Intel and numerous hardware manufacturers backed PCI Express, and while PCI-X was used in G5 Macs and in IBM and Compaq servers, those machines were quickly discontinued (along with FireWire, which was eventually replaced on Apple machines by Thunderbolt) once Apple switched to Intel CPUs in the mid-2000s.

Wintel Comes And Wins

After years of being confined to what were basically fleet sales, IBM discontinued the PS/2 line and MCA in the mid-1990s, preferring instead to concentrate on the revived "IBM PC" (later renamed "ThinkCentre") brand of new, ISA/PCI-based machines sold as business desktops, and the highly successful ThinkPad line of notebooks, introduced in 1992. This marked the end of IBM's dominance of the PC clone market, with the balance of power now shifted to Microsoft, Intel and the clonemakers.

The introduction of Windows 3.0 in 1990 also finally made Windows a legitimate platform after several years of false starts; it placed higher demands on both graphics hardware and Mass Storage, and it was this need for better hardware that drove PC development.

Another side effect of the introduction of Windows 3.0 was the uptake of the mouse in the PC market. Until Windows 3.0, few PCs had a mouse, and users operated exclusively with the keyboard. This was largely because the PC was originally command-line based and had no use for a mouse; until the late 1980s, mice were considered an expensive novelty for anyone who didn't require a GUI, such as desktop publishing users, and shared much the same status as digitizers and light pens. The introduction of text-based menus and GUIs rendered in text mode (sometimes called "TUIs") didn't help much, as many of the menu solutions were poor retrofits onto existing applications, and many TUI-based applications worked just fine without a mouse. Previous versions of Windows hadn't exactly set the world on fire, as they were picky about memory configurations, lacked Killer Apps, and were difficult to set up. The introduction of the 80286 and 80386 CPUs, whose protected mode made multi-tasking and breaking through the "640k barrier" much easier, along with Windows 3.0 being much easier to install, helped increase Windows' acceptance. With everything in Windows being fully GUI-based, the mouse had a greater role to play as an input device, as keyboard navigation was much more difficult with multiple programs open (something not really possible without Windows, though a few DOS-based programs, like DESQview and the MS-DOS 5/6 DOS Shell, certainly tried). However, Windows only gained full acceptance in the PC market with Windows 3.1, which introduced much-improved multimedia support over Windows 3.0. Also, the Killer App for PC multimedia as a whole, Myst, required at least Windows 3.1 to run.

With the introduction of VESA Local Bus and then PCI between 1993 and 1995, along with improved video, sound and storage hardware, the PC started to look less like a classic 8-bit computer with bolted-on upgrades and more like a high-end RISC workstation. The introduction of the second-wave Pentium in 1995 and the Pentium II and AMD K6 in 1997, along with the ACPI power-management and configuration standard in 1998, blurred the distinctions even further, and convinced people that a cheap desktop could perform as well as an expensive UNIX workstation. AMD sweetened the deal further in 1999 with the announcement of the "x86-64" instruction set, which added 64-bit capability to the PC and fixed some of the 80x86's long-standing quirks; the first CPU to feature it was the Opteron, released in 2003.

This caught Intel in a bad spot. Intel had been working with Hewlett-Packard on a new 64-bit processor called "Itanium", and had pretty much bet the company on it; much like what Compaq, DEC and Microsoft had wanted to do with MIPS, and what Apple, IBM and Motorola had wanted to do with PowerPC, Intel had made ambitious plans to discontinue the Pentium and move the entire PC platform over to Itanium by the mid-2000s. The Itanium was not a regular CPU. It used a new architecture based on "very long instruction words", bundles of instructions that drive the CPU's execution units more-or-less directly instead of relying on a hardware decoder and scheduler to do it; on conventional CPUs that scheduler is what keeps the multiple logic units and FPUs busy, a complex process that was (and is) extremely difficult to replicate in software. Because of this, the Itanium was hard to develop for, at least at first, and performance suffered because of the immature tools. On top of that, the chip was late to market and incredibly expensive once it did show up, leading to trade papers calling it "Itanic", since it seemed to be sinking fast. Despite all this, Intel stood by Itanium right up until the Opteron came out — and then changed their minds quickly upon seeing how popular it had become, working on plans to add 64-bit support to the Pentium 4 line as well as the then-upcoming Core 2 processors. Since then, the Itanium found a niche in replacing older RISC systems like DEC's Alpha and HP's PA-RISC, and the Itanium 2 found use in high-performance computing, but it never had the mass-market success Intel was hoping for. The Itanium has since been discontinued, and HP/HPE, which turned out to be its last customer, is now in the process of moving the systems and OSes that relied on it to x86-64.

Graphics Boost: VESA, Super VGA, graphics accelerators and the rise of 3D

When we last left the topic of PC graphics, the PC had gained the ability to show 256 colors at up to 320×200 (officially) or 360×480 (hacked), or 16 colors at 640×480, which was pretty impressive at the time. However, as technology progressed, so did the graphical capabilities of the PC. Various card manufacturers surfaced and began pushing for more colors at higher resolutions practically as soon as VGA was introduced, with vendors like ATI making early "Super VGA" cards that had double the RAM of a normal VGA card and could render 256 colors at VGA's maximum resolution of 640×480.

All these cards appearing so quickly came at a price, though, and that price was standardization. Every vendor used different "mode numbers" for different screen resolutions and color depths, which meant that every DOS program had to ship a separate driver for every card it wanted to support. One vendor, NEC (who made computers but was far better known at the time for their high-end "MultiSync" monitors), founded the Video Electronics Standards Association to make sense of all the chaos and provide a single "Super VGA" standard.

What VESA did first was compile a standard set of modes every card could support, including extended high-resolution and 256-color modes. They also standardized a 16-color, 800×600 mode which was officially dubbed "the" Super VGA mode; while its visual quality was questionable (it was designed to be easy to implement on existing VGA hardware, and so used an interlaced, 56 Hz refresh rate), it was a huge deal at the time, especially with the release of Windows 3.0 not long after. Next, they provided a standard interface to these modes called the "VESA BIOS Extensions", or VBE, with the intention of standardizing the "mode numbers" and making it so that DOS software only had to support one virtual "card". VBE wasn't widely implemented until the mid-1990s, by which time most systems were moving to Windows 95 (which used its own video drivers), but VBE support was still very important for DOS games, especially those that required higher resolutions. Most famously, the DOS and Windows 3.1 versions of SimCity 2000 shipped with UniVBE, a universal VBE driver that filled in the gaps for people with cards made before about 1994.
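A few of the mode numbers VESA standardized illustrate the point of VBE: the same number meant the same resolution and color depth on every conforming card, which is what made a universal driver like UniVBE possible at all. The lookup-table code itself is just an illustration (a real program would set a mode via the VBE interface, INT 10h with AX=4F02h):

```c
#include <stdint.h>

/* A sample of the VBE-standardized "Super VGA" mode numbers. */
typedef struct { uint16_t mode; int width, height, bpp; } VbeMode;

static const VbeMode vbe_modes[] = {
    { 0x100,  640, 400, 8 },
    { 0x101,  640, 480, 8 },
    { 0x102,  800, 600, 4 },   /* "the" Super VGA mode, 16 colors */
    { 0x103,  800, 600, 8 },
    { 0x105, 1024, 768, 8 },
};

/* Look up a standard mode number; returns 0 if it's not in the table. */
const VbeMode *vbe_find(uint16_t mode) {
    int n = sizeof vbe_modes / sizeof vbe_modes[0];
    for (int i = 0; i < n; i++)
        if (vbe_modes[i].mode == mode)
            return &vbe_modes[i];
    return 0;
}
```

Before VBE, a program had to carry a table like this for every chipset it supported, each with its own vendor-specific numbers.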

In 1991, a company called S3 Graphics introduced a "graphics accelerator", which is, all in all, a fancy name for their graphics chip. True to its name, though, the chip (whose model number, 86C911, was a nod to the Porsche 911) was impressively fast — graphics were smoothly rendered, windows opened fast and felt snappy, and images popped onto the screen instead of appearing line by line. The secret to the speed was that it implemented two things. Remember how we said that VGA lacked blitting abilities? The S3 accelerators fixed that by implementing blitting (specifically, the variant called "bit-blitting") in hardware, removing that limitation from the list. They also implemented double buffering: the fast accelerator drew into off-screen graphics RAM, which acted as a buffer, before the end result was displayed on screen. This allowed for smoother animation and gave the illusion of things appearing immediately, as the (faster) GPU rendered graphics to the buffer while the VGA DAC (which is locked to the refresh rate of the monitor and thus much slower) pulled completed frames off the buffer to display on the screen. Many card manufacturers picked up the chip and started pumping out graphics cards that used it, while competing chip makers tried to ape them by making their own accelerators, with varying degrees of success.
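Both techniques can be sketched in software to show what the S3 chip was doing in hardware (the names and the 320×200 "screen" are illustrative, not the chip's actual interface):

```c
#include <stdint.h>
#include <string.h>

/* Software model of the accelerator's two tricks: a rectangular
   bit-blit, and double buffering (draw a whole frame off-screen,
   then present it in one copy). */
enum { SCR_W = 320, SCR_H = 200 };

/* Copy a w*h rectangle of pixels from src into dst at (dx, dy). */
void blit(uint8_t *dst, int dx, int dy,
          const uint8_t *src, int w, int h) {
    for (int row = 0; row < h; row++)
        memcpy(dst + (dy + row) * SCR_W + dx, src + row * w, w);
}

/* Present a finished back buffer to the visible screen at once,
   so partially-drawn frames are never seen. */
void present(uint8_t *screen, const uint8_t *back_buffer) {
    memcpy(screen, back_buffer, SCR_W * SCR_H);
}
```

The speedup came from moving exactly these loops off the CPU: the accelerator performed the copies itself out of its own fast video RAM, while the CPU went on with other work.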

Then in 1995, the 3D craze hit. Pseudo-3D graphics had been around for a while by this point in the form of Wolfenstein 3D, Doom and their various clones, but the rise of multimedia, along with the hype from the first generation of VR glasses hitting the market and the seminal 1995 computer-animated film Toy Story, made gamers crave better, more realistic 3D. A startup called Nvidia released a 3D chip called the NV1. It was quickly picked up by Diamond Multimedia, who were looking to make their card stand out in the already saturated "graphics accelerator" market. The card proved rather impressive, as it could replace both the video card and the sound card and produced beautiful, fluid 3D graphics, but it was ultimately a flop due to the oddball choice of quadrilateral polygons; when DirectX was ratified around the more traditional triangular polygons, both Nvidia and Diamond were sent back to the drawing board.

Also, while the NV1 was good with both 2D and 3D graphics (so long as its proprietary, quadrilateral-based API was used), the sound codec included on the card was a different story: it couldn't even hold up against the cheaper SoundBlaster clones (the SoundBlaster being the de facto choice in sound cards at the time). This was a problem, given that users most likely had to remove their existing sound card to prevent conflicts. Additionally, the sound portion couldn't function reliably in DOS mode, being one of the earliest PCI sound devices on the market.

The only player that had any luck at the time was another startup, 3DFX. Their chipset could do 3D very well but couldn't render 2D at all. They got around this by having the card interface with the PC's existing video card, compositing its 3D output over the 2D image the existing card generated. While this required potential buyers to have a free PCI slot open on their motherboard, the marketing department managed to spin it into a feature, and the card was very well received as a result. 3DFX would also go on to pioneer parallel multi-GPU rendering with the Voodoo2, as one could connect two Voodoo2 cards together for better performance.

After the turn of the millennium, the landscape started to change. ATI, previously a player in the "VGA clone card" market, had jumped on the 3D bandwagon shortly before the turn with its Rage line of 3D chipsets, and went from strength to strength to become a formidable player in the 3D market. Nvidia, meanwhile, made a comeback with the Riva chipset and, like rival ATI, grew from its success. Companies like S3 and Matrox fell by the wayside; others, like Diamond, filed for bankruptcy, and some, like NEC, left the PC market.note  By the mid-2000s, 3D capability was an essential requirement in video cards, and only two main players, ATI and Nvidia, remained. Eventually, AMD would buy up ATI, Nvidia would buy 3DFX, and Intel (after buying Chips & Technologies, who had popularised the term "chipset" and eventually became a specialist in portable computer graphics chips) would introduce its own graphics silicon integrated into its CPUs. AMD and Nvidia remain the top players in the 3D market.

Then, shortly before the turn of the decade into The New '10s, another shift happened. Graphics cards had previously been strictly for graphics. With the GeForce 8 series, Nvidia spearheaded another change: graphics cards could now also do general-purpose calculations in addition to graphics. This added a big boost to the processing power of many PCs. Previously, when an explosion occurred on screen, it was the CPU's task to calculate the trajectory of the debris; this was CPU-intensive, and meant that many games simply didn't bother with particles. Now the CPU could hand the particle physics to the graphics card as it rendered the explosion onscreen, instead of half-heartedly rendering an explosion. And as an added bonus, the card could be used for more than just physics — it could also be used to power the enemy AI.
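The kind of work being offloaded is easy to sketch: every debris particle gets the same small, independent update, which is exactly the shape of problem a GPU's many cores chew through in parallel. Here is a minimal CPU-side illustration (all names invented for this example):

```python
GRAVITY = -9.8  # m/s^2, straight-down acceleration on every particle

def step_particles(particles, dt):
    """Advance each (x, y, vx, vy) debris particle by one timestep.
    Every particle gets the same independent update, which is why the
    job maps so naturally onto a GPU's many parallel cores."""
    return [
        (x + vx * dt, y + vy * dt, vx, vy + GRAVITY * dt)
        for (x, y, vx, vy) in particles
    ]
```

On a GPU, the loop body would run as one thread per particle instead of serially, but the arithmetic is the same.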

From The New '10s, the popularity and capability of graphics cards grew rapidly. Modern games, where the graphics pipeline is the main bottleneck, could be run across multiple displays simultaneously at resolutions of up to 8K, in stereoscopic 3D, or in VR with ease. GPUs became more accepted for uses beyond video games and graphics work, and today are perhaps the primary vehicle of computation in modern scientific computing, Artificial Intelligencenote , cryptocurrency, and blockchain technology. The number of graphics cards sold to consumers surged during the COVID-19 Pandemic as interest in both PC gaming and cryptocurrency increased dramatically. The concept of a general-purpose GPU that can be arbitrarily and transparently used for any calculation has also become popular among developers, further cementing the GPU as an integral part of desktop computing.

Gaming audio: Here comes the band

One of the other downsides to the original PC was audio support. With the exception of the PCjr and the Tandy clones, the audio capabilities of a PC were no better than those of a baseline Apple ][, as it was equipped with only a basic beeper (though unlike the Apple II, you got a bit of help from having one of the system timers wired to the speaker, allowing its use as a one-voice divide-down synth). The first dedicated audio device for the PC was the Texas Instruments SN76489, a programmable sound generator comparable to the General Instruments AY-3-8910, which shipped in the PCjr and was co-opted by Tandy into several of their PC compatibles. However, the SN76489 was specific to those models and wasn't available in add-on card form for owners of other machines. The first recorded "sound card" created for existing PCs was the IBM Speech Option card, an expensive kit meant for PC telephony (later also repurposed for speech therapy) that was never meant for consumer use. The first sound card marketed to consumers was the Covox Speech Thing, a sound device that plugged into the parallel port, but the first widely accepted card was AdLib's OPL2 offering, which launched at the same time as IBM's Music Option card and Creative's Music System card and won out on both music quality (Creative's option was crude, used two Philips square-wave PSGs, and sounded really lackluster as a result) and price (the IBM Music Option card was priced much higher, as it targeted musicians; IBM had not considered gamers a target customer for the card). However, the AdLib was only a synthesizer card, with no audio capabilities beyond playing music and rudimentary sound effects. (There exists a hack to turn the card into a rudimentary PSG, but few games supported it, the audio quality was barely passable at best, and once the AdLib entered PSG mode, music playback could not resume until the card was reset. This meant developers had to use the card for either sound effects or music, but not both, although some, like Epic Games, worked around this with an audio engine that put the card into sound effects mode and then used tracker music and audio mixing to achieve both at once.) Covox saw this as an opportunity and advertised the Speech Thing as a great way to supplement the AdLib's musical capabilities. While the Speech Thing was a rather crude device whose audio was only slightly better than an AdLib in PSG mode, its relatively low cost meant that most gamers could get one to supplement the AdLib card, and more importantly, it ensured that background music could play alongside sound effects.
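The "one-voice divide-down synth" mentioned above worked by loading a divisor into the 8253/8254 timer channel wired to the speaker; the chip counted down from that divisor at its fixed input clock to toggle the speaker and produce a square wave. A small sketch of the arithmetic (the helper function is invented here):

```python
PIT_CLOCK = 1_193_182  # Hz; the PC's 8253/8254 timer input clock

def pit_divisor(frequency_hz):
    """Divisor to load into the speaker's timer channel to get a square
    wave of roughly the requested pitch (integer division limits accuracy,
    which is one reason the beeper always sounded slightly off)."""
    return PIT_CLOCK // frequency_hz
```

For example, concert A (440 Hz) works out to a divisor of 2711, giving a pitch a hair above 440 Hz.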

Both were eventually dethroned by the Creative Labs SoundBlaster, which one-upped AdLib and Covox by offering full AdLib compatibility and better-quality audio, improving on the Speech Thing with PCM audio backed by DMA support (which offloaded work from the CPU), and including a game port for machines that didn't already have one (many IBM machines shipped with a game port, but not many clones did). Later models added a CD-ROM connector for hooking up a proprietary (or, on some versions of the Sound Blaster 16, IDE/ATAPI or even SCSI) CD-ROM drive. Up to that point, however, many gamers considered the sound card and CD-ROM drive luxury items and gamed with only basic PC audio. Few PC gamers used a joystick and even fewer gamed with a mouse, while very few PC games came on CD-ROM, and those that did were mainly Compilation Rereleases or "talkie" versions of existing floppy disk games.

Then Myst was ported over from the Apple Macintosh.

Myst was a Killer App for the sound card and CD-ROM drive on the PC platform (and arguably also the killer app for Windows 3.1 and, by extension, mice on the PC). Thanks to the hype Myst was already getting among Mac gamers, PC gamers started buying the game and the necessary equipment to see what all the fuss was about, and the hype spread over to the PC world. Overnight, many gamers changed their view on the necessity of owning a sound card and CD-ROM drive and bought them en masse (along with Windows and a mouse) just to play the game. This in turn led to a renaissance of talkie games for the PC: many publishing houses started putting out "enhanced with speech" versions of their games on CD-ROM, and many Edutainment Games that were previously Mac-only were ported over. Clones of Creative's sound cards started appearing, along with dozens of other original cards; some, like the Gravis UltraSound, gave Creative a run for their money and competed for the prime spot in users' PCs, while others, like the Opti Mozart, were quickly forgotten. Granted, Trilobyte Software had gotten there several months earlier with The 7th Guest, but that game went to the PC first, when most PC owners didn't yet have a CD-ROM drive and sound card to play it on, and its M rating gave it a much more limited market than Myst's E rating; lacking Myst's built-up hype, it was largely overlooked until Myst was ported.

By 1996, the shift in direction for the PC's future was clearly visible. However, there was a problem: there was no one clear standard for sound cards, with features like sample rates, synthesizer types and even wavetable support varying wildly. The closest thing to a standard was SoundBlaster compatibility, which was patented by Creative Technology, meaning companies had to pay Creative to clone the card or risk being sued. This led Microsoft to define a standard called "Windows Sound System", a rudimentary baseline in which all sound cards supported the same basic set of features so Microsoft could write one universal driver to support them all. However, the idea met with resistance and failed to take off, even if some manufacturers adopted it.

Then Intel came up with the AC'97 standard in 1997. This was more than a standard; it was a radical shift. Where sound cards had previously been add-on devices, AC'97 moved the sound chip onto the motherboard itself, giving onboard audio a much-needed upgrade (though it was more of a supplement than a replacement, since the onboard beeper remained a separate part that could operate independently of the AC'97 chip). Various semiconductor manufacturers jumped on the bandwagon. However, the specification was too far ahead of its time. It did away with hardware MIDI synthesizers in favor of software synths, but Windows was not yet properly optimized for multimedia (back then, only one program could access the sound card at a time), software synthesizers consumed a large chunk of memory and CPU power, and a poorly programmed driver would sometimes refuse to let the synth play music while another program was playing sound. Worst of all was the lack of MS-DOS support. This was compounded by the rise of immersive gaming: gamers were clamoring for sound cards that could drive more than two speakers, a trend Intel had failed to anticipate. As a result, while the standard was embraced by casual users, gamers steered clear of it, preferring surround sound cards like the SoundBlaster Live! and the Aureal Vortex.

Intel then tried again with the HD Audio standard in 2005, and was far more successful. It was around this time that onboard audio became accepted as a sound source by gamers, and by 2013 few companies were making consumer sound cards any more, with many cards since then focusing less on reducing CPU load and more on prosumer use, sound clarity, and game-breaking features like hardware-assisted enemy tracking in FPS games. Some prosumer cards even advertise hardware synthesizers, now a rarity in PC sound hardware.

The PC today

Today, the PC's various implementations are collectively the most popular desktop computer platform in the world, and have even made inroads into scientific and high-performance computing thanks to huge leaps in processing capability as well as an emphasis, post-Pentium 4, on power savings. Several attempts to update the PC using newer parts have come and gone; most of them failed after 1995, as the PC platform ended up absorbing most of the features that a switch to MIPS or PowerPC would have brought, including, in large part, the RISC philosophy itself. Most Intel x86 designs starting with the Pentium Pro function in a manner similar to RISC processors, dividing x86 machine-code instructions into smaller "micro-operations" that are actually executed by the processor, in contrast to earlier x86 processors, which executed instructions in patterns nearly unique to each instruction.
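As a purely illustrative sketch of the micro-operation idea (real decoders and their encodings are proprietary and far more involved than this), a complex x86 instruction like an add from memory can be thought of as cracking into a load plus a register-only add:

```python
# Purely illustrative: real micro-op encodings are proprietary and far
# more involved. Instructions here are (mnemonic, dest, src) tuples.
def crack(instruction):
    """Split a memory-operand x86 'add' into RISC-like micro-operations."""
    op, dst, src = instruction
    if op == "add" and src.startswith("["):
        # add eax, [ebx]  ->  a load micro-op plus a register-only add
        return [("load", "tmp", src), ("add", dst, "tmp")]
    return [instruction]  # register-only forms pass through as one micro-op
```

The payoff is that the back end of the chip only ever sees a small set of simple, uniform operations, which is what makes RISC-style scheduling tricks possible on top of the old x86 instruction set.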

On top of all this, the rise of the clones meant that pretty much everyone was selling nearly identical systems with little to differentiate them; this made margins even tighter and made PC makers more reliant on advertising, system aesthetics (the aforementioned ThinkPad was one of the first PCs to buck the trend of the generic beige box), gimmicks such as "100x CD-ROM" software and other pack-ins, and, if all else failed, price. This environment made it much more difficult for a new player to enter, as any new system would almost certainly be more expensive than a regular PC and would have the additional hassle of porting all the software. Apple was able to stick it out thanks to clever advertising and innovative system design, but they, too, eventually switched to x86 in 2006.

Additionally, savvy users have taken to building their own computers, tired of the limitations of pre-built machines. The base of such users has grown to the point that many gamers nowadays prefer to build their own PC rather than buy a pre-built brand-name system, and a thriving industry has grown up making parts solely with such users in mind. Some manufacturers who started out making pre-built machines have also begun offering parts independently, allowing gamers to customize the specifications of their machines to their tastes. The build-it-yourself market started relatively quietly in the late 1980s, then exploded during the 1990s, with magazines such as Computer Shopper becoming catalog-sized tomes dedicated to ads aimed at both the casual buyer and the DIYer.

So far, there has never been a successful competitor to x86 processors in the desktop market, with most challengers either halting further development, as in the case of the Motorola 68000, Alpha, SPARC, MIPS, and Itanium, or being relegated to embedded systems, as happened to PowerPC (though it remained in use in Macs until the mid-2000s). However, ARM CPUs have seen massive increases in computing power since the mid-2000s and have become much more competitive with x86 designs. Chips using the ARM ISA have dominated consumer electronics for years because of their clean system design and low power usage, an important feature for battery-powered devices. By contrast, Intel had to use a radically different architecture for its Atom line of processors to compete in the low-power space, and even then could not match the performance-per-watt of competing ARM-based designs.note  Chips from Nvidia, Texas Instruments, Samsung and Apple show that ARM chips have a future beyond embedded systems, and with consumer trends shifting from desktops to laptops and tablets, even Microsoft has begun to offer ARM versions of Windows, starting with Windows 8's highly stripped-down "Windows RT" and continuing with full-featured ARM ports of Windows 10 and 11.

In the latter half of The New '10s, there were rumors that Apple was planning to switch its desktop machines to the ARM ISA. After years without confirmation, Apple finally announced the transition at the 2020 Worldwide Developers Conference: all Macs from late 2020 onward would use in-house ARM designs. The funny thing is, the ARM CPU was originally created to power desktops, first appearing in a desktop computer called the Archimedes from the UK-based firm Acorn Computersnote ; the architecture only shifted to embedded electronics after PCs gained dominance. Now that Apple has made the switch, the architecture has come full circle.

As a footnote, IBM themselves left the personal computer business for good in 2005, selling their PC division to a Chinese company named Lenovo (hence Lenovo now sells the ThinkPad).


Because of the platform's Long Runner status and its eventual passing into the public domain, there are literally tens of thousands of different makes, models and configurations, made by hundreds of producers (both large vendors contributing to the development of the platform and garage tinkerers assembling their own dream machines), with even more custom-built by hand, so only notable models and some typical examples of each era are presented here. Notable variants and clones, particularly the IBM PCjr and the Tandy 1000, and consoles based on PC architecture (the Xbox, PlayStation 4, and Xbox One) have their own pages.

    open/close all folders 

    Original IBM machines 
Note that for most of the platform's history under IBM, these computers were aimed squarely at business users, and any entertainment capabilities like sound, multimedia and accelerated graphics were left to third parties. IBM basically ignored that market entirely.


The original machine that started it all, actually a third attempt to create a PC by IBM, and the first that really took off.

  • Intel 8088 processor running at 4.77 MHz
  • Optional Intel 8087 math co-processor
  • 64K RAM, expandable to 640K; some expansion systems could backfill up to 736K if you were using an MDA. Later programs allowed backfilling on CGA displays using the upper memory area (UMA) scheme, assuming additional RAM chips were installed to push the machine past 640K. Memory could be further increased with EMS cards that plugged into the expansion slots; each card could hold up to 2MB of EMS memory and used bank-switching to swap data in and out of a specially allocated 64K region above the 640K conventional RAM area but below the 1MB limit.
  • Five 8-bit expansion slots
  • Keyboard and cassette ports
  • Optional single or dual 5¼" floppy drives — originally single-sided, carrying just 160K
  • Could be ordered with either an MDA (80×25 text only with blink, bold, underline and reverse-video effects) or a CGA (80×25 or 40×25 text in 16 colors, 320×200 in four colors from 6 palettes (four documented and two hidden), 640×200 monochrome; a hacked 160×100 16-color mode was also available)
    • CGA's 80×25 text mode was a joke. Unlike MDA's similar mode, it used an 8×8 matrix for each character, which, accounting for letter and row separation, left only 7×7 for the symbol itself at best. MDA's character tile resolution was 9×14, a feat unmatched at least until VGA, allowing for then-unprecedented text clarity and readability — no small selling point for the early models, which were used mostly as office machines. CGA characters, in contrast, looked ugly and grainy and were nigh unreadable. Several clone vendors (most notably Compaq and AT&T/Olivetti) remedied this by providing a double-scan text mode, which ran either at MDA's 720×350 (Compaq) using its character matrix, or at 640×400 (AT&T, Olivetti, and several "super CGA" and "super EGA" boards) using a much more legible 8×16 character matrix.
    • It is possible to use both CGA and MDA cards at the same time, creating the earliest example of a multi-display setup. Some programs can take advantage of both cards being present (e.g. SignMaster, where the bulk of the editing work is done on the MDA display, but invoking the "render" command switches to the CGA display to create the final layout); some glitch out and do interesting things, like showing text on the MDA screen but graphics on the CGA screen (e.g. an early video pinball game called Wizard, which shows the score and lives-remaining text on the MDA display but the pinball playfield on the CGA display); and some games simply fail to run and crash the system instead.
  • It ran MS-DOS: a single-tasking, 16-bit Operating System that in its early days was basically a clone of Digital Research's CP/M, and didn't actually provide much in the way of an API beyond a file system and limited memory management. Everything else was up to the programmer.
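The EMS bank-switching described in the memory bullet above (a 64K page frame below 1MB through which a much larger card is paged in and out 16K at a time) can be sketched as follows. This is a toy model under the LIM EMS scheme; the class and method names are invented:

```python
PAGE_SIZE = 16 * 1024  # EMS maps memory in 16 KB pages
FRAME_SLOTS = 4        # the 64 KB page frame holds four of them

class EmsBoard:
    """Toy model of an EMS card: a 2 MB board visible to the 8088 only
    through a 64 KB page frame above the 640 KB conventional area."""

    def __init__(self, total_kb=2048):
        self.expanded = bytearray(total_kb * 1024)  # RAM on the card
        self.mapping = [0, 1, 2, 3]                 # logical page in each slot

    def map_page(self, frame_slot, logical_page):
        # Reprogram one page register: "bank switch" a new 16 KB page in.
        self.mapping[frame_slot] = logical_page

    def read(self, frame_offset):
        # A CPU read of the page frame lands in whichever logical page is
        # currently mapped into that 16 KB slot.
        slot, offset = divmod(frame_offset, PAGE_SIZE)
        return self.expanded[self.mapping[slot] * PAGE_SIZE + offset]
```

The same four 16K slots are reused over and over, which is how a machine whose CPU can only address 1MB got at megabytes of extra storage.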


A somewhat improved model, introduced in 1983 and made to look less like an 8-bit toybox with bolted-on upgrades and more like its own thing. Otherwise same as the PC, except with more memory, standard floppy drives and a hard disk option.

  • The stock IBM XT used the same CPU as the PC, an Intel or AMD 8088 at 4.77 MHz. Some clones swapped this for a fully 16-bit 8086, and later "turbo XT" motherboards ran the 8088 or 8086 at 7.16, 8, 9.54 or even 10 MHz. Like its predecessor, an 8087 math co-processor could be added as an option.
  • Standard 128K or 256K RAM, up to 640K on board, backfillable to 736K if an MDA adapter is installed, and further expandable with EMS memory cards like the original PC. Later programs allowed backfilling on CGA displays using the upper memory area (UMA) scheme, assuming additional RAM chips were installed to push the machine past 640K.
  • Floppies became double-sided, with 320K capacity (360K with PC/MS-DOS 2.0 and later)
  • Eight expansion slots, all 8-bit. The slot closest to the CPU was technically reserved for the expansion chassis' interface card, and thus had a few quirks with timing and addressing that kept some other cards from working. A few clones (most notably the Olivetti M24/AT&T PC 6300) attempted to widen the expansion bus to 16-bit, but this was never standardized; that would have to wait for the PC/AT (see below).
  • Several optional hard drives were offered, with 8-bit controller cards and 5MB to 10MB capacities; third-party options included the Plus HardCard (one of the first 25 mm or "1/3-height" drives, thin enough to take up only one slot) and early SCSI implementations.
  • Cassette port removed


Main article: IBMPC Jr
An attempt to broaden the PC's appeal from business users to families, the PCjr came with advanced sound and video capabilities and an infrared cordless keyboard. However, its price point, a home computer market that was oversaturated at the time, several unprecedented manufacturing problems causing delays, and, to top it all off, its proprietary nature and inability to use parts meant for clones and other IBM PCs without adapters (if at all) meant that it was a commercial flop.
  • An 8/16-bit Intel 8088 CPU running at 4.77 MHz, much like the original IBM PC. However, an 8087 co-processor upgrade cannot be installed without a third-party kit, and even then some programs will refuse to use the co-processor if they detect a PCjr unless a second workaround is used (a program cartridge that spoofs the machine ID).
  • Standard 64K of onboard RAM, upgradable to 128K; the upgrade is needed to access the better graphics capabilities. The machine cannot go beyond 128K internally due to the design of the platform.
    • "Sidecar" RAM modules are available, using XMS-like architecture. This means that the total amount of memory modules present is physically limited to the 1MB ceiling imposed by the 8088. EMS modules are possible, but were never produced for the system.
    • Some third-party memory upgrades (like ES Quality Products' HOT SHOT boards) got around the lack of internal memory slots by using an interposer board that fit between the CPU and the motherboard.
  • Video Gate Array (aka Tandy Graphics Adapter/TGA, not to be confused with the Video Graphics Array that came with the PS/2 line of PCs). CGA-compatible, plus an additional 160×200×16 graphics mode. Upgrading the internal RAM to 128K unlocks an additional two modes: 320×200×16 and 640×200×4. Sidecar RAM modules will not unlock the additional video modes.
  • Expansion modules are known as "sidecars" and can stack. XMS-like RAM modules are the most common, however other modules like additional sound modules and parallel port modules can also be obtained.
  • PC beeper sound, but an onboard SN76489 PSG supplements the audio capabilities with three square-wave channels plus one noise channel. Officially, the PC beeper and the SN76489 cannot be used at the same time due to a design decision (to the annoyance of many developers), but hackers have found a way to switch between the two quickly enough to gain an extra audio channel. You can, however, add a sidecar with a second SN76489 to the machine; annoyingly, adding the sidecar disables all internal sound devices.
  • A single 5¼" double sided, double density drive with a storage capacity of 320KB (360 KB if you're running PC-DOS 2.10). However, there are also a pair of ROM cartridge slots under the drive, and the machine has the necessary connectors for a tape drive.
  • Two built-in joystick ports and a light-pen port are present around the back, further cementing the machine as a family computer. Additionally there is a serial port along with a keyboard cable port. Plugging in the optional keyboard cable eliminates the need for batteries in the keyboard and disables the infrared port. In many cases the cable is desirable, as the infrared port is unreliable and prone to interference.


First major upgrade to the system, it introduced a much faster, new-generation CPU, a major boost in performance and a promise of true multitasking on an inexpensive PC.

  • Intel 80286 processor running at 6 or 8 MHz, depending on age, 80287 co-processor an option.
  • 640KB onboard RAM, expandable with up to 16 MB of "expanded" memory via the same add-on EMS cards as the PC and PC/XT.
    • IBM also introduced the XMA card, a very early take on XMS-style memory, in this period. The XMA card also plugs into an ISA expansion slot like the EMS cards, but instead of presenting EMS memory, the board takes advantage of the 286's larger memory limit and presents its memory directly above the 1MB mark — like its successor, XMS memory. Like the EMS boards, they came in sizes ranging from 128KB up to 2MB. However, the way this memory works means the board cannot be used in older PC/XT and original PC machines. Furthermore, one can install only up to 14MB worth of XMA cards, as the memory stacks above the 1MB mark and counts towards the CPU's 16MB limit (as opposed to EMS memory, which kept its memory separate and presented a window to the EMS card in the reserved memory area instead).
    • It is also possible (but extremely difficult) to fit up to 16MB of RAM onto the motherboard itself through correctly sized DIP-socket RAM chips. Doing so also reduces the number of XMA cards that can be installed, as those count towards the 16MB limit.
  • 20 or 40 MB high-performance hard drive — originally still an option, but a must by the end of the 1980s
    • The "AT Attachment" (ATA, commonly known as IDE) interface started here, not as an integrated controller at first, but as the Western Digital ST-412/floppy combo controller board IBM included with the AT. Later on, Compaq asked Western Digital to make a standardized interface for the controller that didn't require all of the 80-odd data, address, control and interrupt signals present on the AT bus, and they were able to reduce it to 40 pins (all 16 data lines, plus a simplified addressing scheme, interrupts and handshaking). After making a few prototype drives with Control Data (and shipping the Portable II with an ST412-to-ATA "bridge board" while the better ATA drives were being developed), Compaq recruited Conner Peripheralsnote  to start building ATA drives for them. By 1990, other manufacturers (including Western Digital themselvesnote , Seagatenote , Quantum note  and Maxtornote ) had followed suit, switching from ST-412 and low-end ESDInote  drives to ATA drives; ST-412 was dead not long after that, while higher-end ESDI drives were cut over to SCSI.
  • Single or dual "high density" floppy drives — up to 1.2 MB capacity, with "double density" 3½-inch 720KB floppy as an option. The later models supported the then-new (but eventually standard) "high density" 3½-inch 1.44MB floppy.
  • Optional MDA, CGA, EGA or the rare and expensive Professional Graphics Controller, an early GPU meant for CAD use. EGA was a major upgrade over CGA, finally bringing high-resolution (640×350) color graphics to the platform. EGA could display 16 colors out of a 64-color palette, and despite being pretty awkward to program,note  the stock card carried a then-huge 64K of Video RAM and allowed a clever programmernote  to do tricks in software that were previously thought possible only on consoles with their hardware accelerators, like smooth scrolling. However, due to the way the monitor interface was implemented, 64-color support was only guaranteed to work in 640×350 mode, which required at least 128K of VRAM and worked best with the full complement of 256K.note  While many third-party cards shipped with 256K, most of IBM's didn't; between that and the need for an Enhanced Color Display, 640×350 mode was mainly used for office programs and professional work.
  • On the software side, it was the last model intended to run the "stop-gap" MS-DOS, with the next IBM range expected to use the new "grownup" OS—the much vaunted OS/2, which promised true multitasking and a graphical user interface.


IBM's attempt to fight the hordes of the clonemakers by releasing the much improved, but closed and patented MCA bus with an attendant computer platform in 1987. The range was met with mixed success (they were expensive, even compared to high-end competitors like Compaq, and much of their market share was from "no one ever got fired for buying IBM" fleet buyers), but some of the innovations introduced here have stuck around to this day.

IBM gave up on the MCA bus late into the line's run, however, when it became apparent that few third party manufacturers were willing to license the patent and build peripherals that used the slot; the last PS/2 machines shipped with the industry-standard ISA and PCI slots.

  • Intel 8086 to 80486DX4 processors in all available speeds from 8 to 100 MHz, co-processors (corresponding 8087 - 80487 parts) possible, a 90 MHz Pentium model was introduced in certain markets late into the line's life. 80486DX and Pentium models do not need a co-processor as the co-processor is built into the CPU itself.
    • IBM's decision to retain the by-then entirely obsolete 8086 and the aging and notoriously idiosyncraticnote  80286 processors in the lineup was the first step in the IBM/Microsoft OS quarrel that led to the birth of Windows NT.
  • Finally, a standard hard drive was included:
    • The 8086-equipped Model 25note /Model 30,note  as well as its upgraded successor, the Model 25/30-286, shipped with either a 20 or 30 MB drive (using an IBM-proprietary version of what was then called "XT attachment" or "XT-IDE"). The 20 MB model was a slow, stepper-motor-based design, whereas the 30 MB model was voice coil-based and very similar in construction to later IBM H-series and Deskstar drives.
    • The next model up, the Model 50, had a very slow 20 MB drive (based on the one from the Model 25/30) as the base option; it used a 50-pin interface that was similar to the 26-pin mini-ST-412 connectors seen on Zenith and Toshiba laptops. With the right riser card (and, in some cases, a BIOS update), the 50 could be updated to the faster "MCA Direct Bus Attachment" drives, a variation on IDE. Confusingly, the "DBA" drives were often referred to as "ESDI" by IBM, since their integrated controllers emulated IBM's standalone MCA ESDI card.
    • The monstrous, full-tower Model 80 system used a discrete ST-412, ESDI or SCSI card, and could support up to two, equally-monstrous 5¼-inch full-height drives.
    • Later PS/2s used standard SCSI or IDE drives.
  • 512K (for a Model 25) to 128MB of memory (for the Pentium 90 PS/2). The 80286 and newer systems introduced XMS, the eXtended Memory Specification, as well as the use of more modern slotted RAM modules. Memory between the 640K and 1MB marks was reserved; memory above 1MB was presented as XMS, which is activated by the HIMEM.SYS driver. XMS-aware programs can access this memory through HIMEM directly, but older programs required EMS to be emulated by a special driver like QEMM or 386MAX. MS-DOS 5.0 onwards shipped with one such driver, EMM386, in the box.
  • 5¼-inch floppies were replaced by the Sony-made 3½-inch ones, the same as on Macs. Unlike the Apple-supplied CLVnote  variants, however, these were CAV,note  which made them cheaper and simpler to implement,note  but limited their capacity to just 720K and made formatted floppies non-interchangeable between the two systems. Later, high-density 1.44M and 2.88M versions were introduced.
  • A tape backup drive was an option on many higher-end PS/2 machines.
  • Another enduring innovation was the pair of fast serial mouse and keyboard connectors that are to this day called simply "PS/2 ports". The keyboard port was electrically identical to the full-size DIN port used on the PC/AT (and, subsequently, many clones); this meant that keyboards designed for the older standard would work on a PS/2 with just a passive adapter (and vice versa).
  • VGA was first introduced as on-board video on the higher-end PS/2s (it was designed from the start as a single-chip solution using semi-custom "gate array" technology, hence the name "Video Graphics Array"); an 8-bit ISA version called the "PS/2 Video Adapter" was announced concurrently.
    • VGA would end up exploding the PC gaming world. Not only did it introduce the new 640×480 16-color hi-res mode, but it also introduced new, incredibly rich and colorful 256-color modes, and in the pretty nifty 320×200 and 320×240 resolutions to boot. These were the same resolutions used by CGA and EGA, letting game developers support VGA and the older modes simply by using different palettes. Moreover, unlike on the EGA, these 256-color modes were very fast and straightforward to program, using packed pixelsnote  and fitting into a single memory segment. Also unlike EGA, all 262,144 colors the DAC could produce were guaranteed to be available in every mode, even 320×200 and 640×200, which opened the door to SNES-like fade-ins and fade-outs and DOOM's gamma correction feature, among other effects. For the first time in history, PC graphics could seriously compete with consoles and the Amiga.
  • Not everything was as bright on the OS side, though. OS/2 was delayed, and when it was finally released, the original 1.0 version lacked the much-ballyhooed GUI (which arrived only with version 1.1 about a year later), it couldn't run more than one DOS application at a time (and even that not very well), and it didn't work on 8086 CPUs at all. What's more, the differing ideas that IBM and Microsoft had about its development eventually led to Microsoft ditching the project for good in favor of Windows, about midway through the development of what was intended to become "OS/2 3.0".note  It also worked entirely differently from MS-DOS, requiring programmers to learn an entirely new API and architecture, as well as purchase an extremely expensive developer's kit; all of this contributed to the perennial lack of software for it.
  • Several of the last PS/2 models had ISA and PCI slots instead of MCA slots.
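In practice, the XMS/EMS memory scheme described above was wired up in CONFIG.SYS. A minimal sketch for MS-DOS 5.0 or later, assuming the drivers live in C:\DOS:

```
DEVICE=C:\DOS\HIMEM.SYS
DEVICE=C:\DOS\EMM386.EXE RAM
DOS=HIGH,UMB
```

HIMEM.SYS exposes the memory above 1MB as XMS; the RAM switch tells EMM386 to provide both EMS emulation and upper memory blocks; and DOS=HIGH,UMB moves most of DOS itself out of the precious first 640K.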
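The fade-in/fade-out effects mentioned above boil down to rescaling every 6-bit DAC entry a little each frame and rewriting the palette; a minimal sketch of the arithmetic in Python (the palette values here are made up for illustration):

```python
def fade_palette(palette, level, steps=32):
    """Scale each 6-bit (0-63) VGA DAC entry toward black.

    level == steps reproduces the original palette; level == 0 is all black.
    Writing each intermediate palette to the DAC on successive frames
    produces the smooth fades seen in VGA-era games.
    """
    return [tuple(c * level // steps for c in rgb) for rgb in palette]

# Hypothetical three-entry palette, 6-bit components as the VGA DAC expects
pal = [(63, 63, 63), (63, 0, 0), (0, 32, 16)]
print(fade_palette(pal, 16))  # halfway through a fade-out
```

Because the DAC guarantees the full color range in every mode, this works identically in 320×200 and 640×480, something EGA's fixed 16-entry palette registers couldn't offer.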


IBM PS/1

IBM's attempt to make the PS/2 into something more home-friendly, after the success of the PS/2 line in business. The name is a bit of a misnomer, as the line came after the PS/2 machines were launched; the numbering scheme was instead meant to signify that the machine was a watered-down version of the PS/2. Taking a few notes from the original Apple Macintosh, the PS/1 featured an all-in-one design with numerous nods towards Apple's 1984 machine. Once again, IBM's use of proprietary connectors and ports, an uncompetitively high price compared to the PC clones, and the lack of upgradability in the lower-end models meant that it sold poorly in its intended market.

  • Intel 80286 to 80486 processors at all available speeds from 8 to 66 MHz, with matching co-processors (the corresponding 80287–80487 parts) as options; 80486DX models need no co-processor, as the floating-point unit is built into the CPU itself.
  • 2-16MB of RAM, with 2MB onboard; additional RAM is added via proprietary modules, a point of contention among most users. Higher-end models accepted industry-standard RAM modules.
  • Onboard VGA graphics
  • No onboard upgradability on lower-end models; higher-end models have 2-3 ISA slots.
  • 3½", 1.44MB double-sided, high density CAV drives like the PS/2. 2.88MB drives are an option.
  • Onboard IDE controller, offered with drives of up to 253MB.
    • A CD-ROM kit was made available for the system. This kit replaces the system's cover and supplies an ISA SCSI card to be used with a supplied SCSI CD-ROM drive. As noted, the kit will only fit higher-end systems that have ISA slots.
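The floppy capacities quoted for these machines fall straight out of the disk geometry: 512-byte sectors, 80 tracks, two sides, and a per-track sector count that doubles with each density step. A quick Python check:

```python
SECTOR_BYTES = 512  # all PC floppy formats use 512-byte sectors

def floppy_bytes(tracks=80, sectors_per_track=9, sides=2):
    """Formatted capacity of a CAV floppy with the given geometry."""
    return tracks * sectors_per_track * sides * SECTOR_BYTES

print(floppy_bytes(sectors_per_track=9) // 1024)    # 720  (double density)
print(floppy_bytes(sectors_per_track=18) // 1024)   # 1440 (the "1.44MB" format)
print(floppy_bytes(sectors_per_track=36) // 1024)   # 2880 (the "2.88MB" format)
```

Note that "1.44MB" is marketing shorthand for 1,440 KiB, which is neither a true megabyte nor a true mebibyte.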

IBM Aptiva and NetVista lines, and the PC line revisited

With the Aptiva line, IBM finally got it right with home computers. Priced to compete with the clones, the Aptiva was more widely accepted than IBM's earlier attempts at home PCs. Despite a custom motherboard design, the line was fully compatible with all PC cards and peripherals. Much fandom exists for this series. Also helping was that IBM contracted Acer, a third-party clone maker famous for their Aspire home PCs, to design the line's casings; the end result was a line of attractive case designs that appealed to families and younger users.

The NetVista line was basically the Aptiva's business-oriented counterpart, launched after the retirement of the PS/2 line. Sharing most of its innards with the Aptiva, the main difference was that the machines came in more muted casing designs, making them better suited to offices than to homes.

Late in the life of the line, the Aptiva branding was retired and the home machines reverted to the IBM Personal Computer branding, while the NetVista models were rebranded as ThinkCentre, in line with the ThinkPad business-oriented laptops. This would also be IBM's final foray into the personal computer market, as IBM soon stopped manufacturing PC hardware and sold the division to Chinese electronics conglomerate Lenovo.

  • Intel 486DX2 through Pentium III or AMD Am486DX2 through Athlon, available in speeds from 66 MHz to 1.2 GHz
  • Up to 1.5GB of RAM; depending on the build, the machine may use 72-pin SIMMs or 168-pin SDRAM DIMMs.
  • Motherboard may have any combination of ISA, PCI and AGP slots depending on era.
  • SVGA graphics with acceleration.
  • Many Pentium-era models shipped with the controversial IBM MWave sound-card-modem combo. While the card was a marvel of design, compatibility was such a big issue (especially due to its lack of DOS support) that IBM was forced to partially refund customers so that they could replace the card with an industry-standard Sound Blaster and an AT-compatible modem.
    • Starting with the Pentium II era, integrated AC'97 audio can be found on many of the PCs.
  • 3½", 1.44MB double-sided, high density CAV drive. 5¼" 1.2MB drives as well as 3½" 2.88MB drives are an option.
  • Common IDE hard disk.
  • CD-ROM drive standard on all Aptiva machines, and an option on NetVista machines.


Games (by no means an exhaustive list):

    Booter Games 
Games that were run by simply inserting the floppy and turning on the computer. Some of these did not use an operating system at all and booted into a custom-written loader which set up the required registers and memory before loading the game; others, like Alley Cat, licensed a copy of MS-DOS or DR-DOS but had their AUTOEXEC.BAT hardwired to start the game immediately on boot. They lost their popularity once hard drives became more common, allowing gamers to simply store games on the machine instead of rebooting it with a different disk.
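A booter of the second kind needed nothing more than a licensed DOS kernel and a one-line startup file; a sketch of such an AUTOEXEC.BAT (the game filename here is hypothetical):

```
ECHO OFF
GAME
```

With nothing else in the startup file, DOS drops straight into the game the moment the machine finishes booting, with no prompt ever shown to the user.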

    DOS Games 
Games that are run through DOS. These remained popular until the mid-'90s, when Windows replaced DOS as the OS on most computers. Windows has tried to keep compatibility with them throughout the years; however, with Windows no longer being DOS-based, along with the move to 64-bit versions of Windows that don't support older 16-bit programs, the use of an emulator (such as DOSBox) is required at this point.
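Running one of these titles today typically means mounting a host directory as a DOS drive inside DOSBox; a session looks something like this (the directory and game names are placeholders):

```
Z:\> MOUNT C /home/user/dosgames
Z:\> C:
C:\> CD GAME
C:\GAME> GAME.EXE
```

MOUNT is a built-in DOSBox command; the emulator boots to a virtual Z: drive, and anything mounted this way appears to the game as an ordinary hard disk.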

Exclusive titles and Multi-Platform games that started here:


    Windows Games 
Given the overlong nature of the list of games made for Microsoft Windows, this list has been divided into four categories encompassing different eras, from the DOS-based origins to the different Windows NT iterations available to consumers.

Alternative Title(s): IBMPC