Useful Notes / IBM Personal Computer

Born in the wake of the Apple ]['s success, the IBM Personal Computer (dubbed the "5150" in IBM's internal numbering system) was IBM's official entry into the desktop computer market, and by far their most successful. Earlier attempts, like the 5100 desktop APL machine and the DisplayWriter word-processing machine, hadn't taken off, and IBM needed something fast to compete with Apple. Bypassing the usual IBM bureaucracy, in 1980 they tasked a team of engineers at an IBM office in Boca Raton, Florida with developing the new machine, and gave them an unusual amount of freedom in how they went about it.

What appeared in August 1981 was nothing like any IBM machine built before. Like the Apple II, the IBM PC was built almost completely out of off-the-shelf parts and had a generous amount of expansion capability. As for the system design, the Boca Raton team considered several processors (including IBM's own ROMP CPU and the Motorola 68000) before settling on Intel's 16-bit 8088. The 8088 was chosen mainly for cost and time-to-market reasons; the ROMP was still experimental, and IBM was concerned that the 68000 wouldn't be available in quantity. Also, the 8088 could re-use many of the support chips Intel had designed for the 8085, making the motherboard design simpler. To ensure a steady supply of 8088s, IBM and Intel recruited Advanced Micro Devices (AMD) to act as a second source, a decision that would have some importance later.

The other big influence on the IBM PC's design was the world of S-100 machines, which were based around the Intel 8080 (or, later, the Zilog Z80) and the "S-100" bus that had been introduced in the pioneering Altair 8800. These machines ran an OS called CP/M, which had been written by a programmer named Gary Kildall in 1974 and was based indirectly on Digital Equipment Corp.'s RSTS and RT-11 Operating Systems for the PDP-11. While they weren't nearly as slick as the Apple ][, S-100 machines were popular with hobbyists and businesses alike, and several CP/M business applications, like WordStar and dBASE, were making inroads.

S-100 machines were large, server-style boxes with a generous number of slots inside, plugged into a central backplane carrying power and data signals. The cards themselves were large and nearly square. To save space, IBM decided against using the S-100 backplane system, and instead went with Apple II-style cards that were long and rectangular, with a 62-pin edge connector near the back end of the card. IBM also added a sheet-metal bracket to the back of the card for structural stability. And since the PC used a regulated, switching power supply, the hot-running secondary regulators that S-100 cards needed were no longer necessary.

In one break with the Apple II's precedent, and as an improvement on the serial consoles S-100 machines used, IBM decided to leave the graphics system off the motherboard and offer two add-on cards: a text-only Monochrome Display Adapter (MDA), intended for business users, and a Color/Graphics Adapter (CGA) for games, education and emulating other color-capable IBM hardware. This gave buyers a choice of video hardware and saved space on the motherboard. MDA was widely praised for its outstanding clarity and readability, especially when combined with IBM's original 5151 monitor, which showed off MDA's effective 720×350 resolution. CGA, by contrast, offered a barely adequate 320×200 mode (with strange, unnatural-looking palettes, to boot) and a distorted, monochrome-only 640×200 mode, and was nearly universally panned. Still, nothing more adequate appeared until the advent of EGA in 1984, so everybody used it anyway, unless they sprang for a third-party Hercules monochrome card, which, unlike the MDA, could address individual pixels, but was much more expensive.

The base system came with just 64K, like the Apple II, but could be expanded to a then-breathtaking 640K thanks to the Intel 8088 processor inside, which had a 1 MB address space (huge for a desktop machine in 1981). It even had BASIC in ROM, just like the Apple II.
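That 1 MB limit, and the 640K ceiling within it, both fall straight out of the 8088's segmented addressing: every physical address is a 16-bit segment shifted left four bits plus a 16-bit offset, giving a 20-bit (1 MB) space, the top portion of which was reserved for video memory and ROMs. A small Python sketch of the arithmetic (illustrative only; the function name is ours, not anything from IBM or Intel):

```python
# Sketch of 8086/8088 real-mode address arithmetic (illustrative, not period code).
def physical_address(segment: int, offset: int) -> int:
    """An 8088 forms a 20-bit physical address from two 16-bit values."""
    return ((segment << 4) + offset) & 0xFFFFF  # wraps at 1 MB

# The highest addressable byte: segment FFFF, offset 000F -> 0xFFFFF (1 MB - 1).
assert physical_address(0xFFFF, 0x000F) == 0xFFFFF

# Conventional RAM ends at segment A000 (640K); everything above it was
# reserved for video buffers and ROMs -- hence the famous "640K barrier".
assert physical_address(0xA000, 0x0000) == 640 * 1024
```

The same arithmetic explains why two different segment:offset pairs can name the same byte, a quirk DOS programmers lived with for years.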

IBM followed up the PC with the XT in 1983, which removed the original PC's cassette interface and made a hard drive option available. 1983 also saw the introduction of the PCjr, a severely crippled version of the XT intended for home use; its main claims to fame were the addition of a 16-color, 320×200 graphics mode and an internal 4-voice PSG (the same TI model used in the ColecoVision). Next was the PC/AT in 1984, which introduced the 80286 processor and a fully 16-bit architecture, along with the Enhanced Graphics Adapter (EGA), which finally made 16-color graphics possible on a regular PC, in resolutions all the way up to 640×350 (although 320×200 remained the most popular for games). And with that, with the capacity for attractive applications and especially entertainment software, the march of history began...


    The Rise Of The Clones 
At first, the IBM PC didn't have much to offer home users and gamers. It was new, expensive, not as good with graphics as the Apple ][ or the Atari 800, and was directed squarely at business users. However, IBM's name on the machine made it a safe buy for businesses that already used IBM hardware, and they ended up buying the machines in droves. The machine's open design sparked a huge third-party expansion market, with dozens of vendors selling memory expansion boards, hard drive upgrades and more. It wasn't long until other computer makers started examining the PC's design and figuring out how to make clones of the machine that could run PC software without issues. The one thing stopping them, however, was the ROM. IBM had a copyright on what they called the "ROM BIOS", and while cloning the hardware was easy, cloning the ROM would be much harder, with few vendors able to get it completely right. It wasn't until Compaq introduced the Portable in 1983 that a truly 100% IBM compatible PC was available, and after that, software houses such as Phoenix and AMI followed suit, opening the floodgates to an entire industry of low-priced PC compatibles.

IBM also had another problem to deal with: Microsoft. When the PC was first being developed, IBM decided they wanted to license an outside OS rather than attempt to write their own, and their first choice would have been CP/M. However, when they tried to meet with Gary Kildall to license it, he wasn't around to sign the papers; the full details are unclear and have become something of a legend, but in the end, IBM didn't get CP/M. What they did get was the product of another little-known Seattle software developer's own frustration with CP/M: MS-DOS. MS-DOS began life as an admittedly "quick and dirty" clone of CP/M, written by a developer named Tim Paterson at Seattle Computer Products.

SCP was mostly a hardware outfit, whose business was in memory upgrades and other add-ons for the aforementioned S-100 machines. When the 8086 appeared on the market, they wanted to use it and quickly threw together a prototype machine. Digital Research had long promised an 8086/8088 port of CP/M but didn't deliver until it was too late, leading Paterson to write his own and name it "86-DOS" or "QDOS" (depending on who you ask). Microsoft, who had already basically lied to IBM and said they had something ready (they did have something: Xenix, but it was a UNIX OS, and IBM wouldn't accept that; Xenix was later sold to the Santa Cruz Operation), paid SCP and Paterson $75,000 for the rights to 86-DOS, did some quick editing, and released the result as MS-DOS/PC-DOS 1.0. Microsoft also put language in their contract with IBM stating that they had the right to license MS-DOS to whomever they wanted without first seeking IBM's approval. This had serious implications for the PC clone business: once the clone makers and BIOS houses like AMI and Phoenix had opened the floodgates on the hardware side, Microsoft could sell those same hardware makers MS-DOS, creating a complete package, and a huge pain for IBM.

While Microsoft intended MS-DOS to be a universal operating system, with applications written once and run anywhere, its programming interface was so poor that many software developers bypassed it and accessed the hardware directly. A number of manufacturers did introduce MS-DOS-based computers, but they all failed in the marketplace because they weren't fully IBM-compatible.

Lotus 1-2-3, a spreadsheet that was the IBM PC's Killer App, was praised for its speed because it had been tightly coded in assembler. This meant that any clone needed to match the hardware as closely as possible in order to run Lotus 1-2-3, which became a litmus test of PC compatibility. This would have consequences for the evolution of the platform's hardware, particularly the "640K barrier."

"IBM compatible" became to the personal computer industry what VHS was to the VCR in the '80s. While other platforms were technically better, it was almost impossible to beat the PC's hardware and software compatibility, and the PC (or, more importantly, MS-DOS on x86 processors) became a de facto standard everyone rallied around. The earliest clone manufacturers were Eagle, Columbia Data Products and Compaq. A number of small mom-and-pop operations sprang up using cheap parts from Asia.

In terms of gaming, the Tandy 1000 was the turning point. A clone of the ill-fated IBM PCjr, the Tandy 1000 featured greater PC compatibility and expansion using standard expansion cards instead of sidecars while keeping the enhanced graphics and sound. With the porting of King's Quest, it proved that the PC could be a viable gaming platform. The Tandy 1000 and other cheap clones such as the Leading Edge Model D shifted the home computer market in the U.S. away from the Commodore 64 to the PC in the latter half of the '80s. Such was the strength of the PC market that even Commodore and Atari came out with their own PC clones.

    IBM Tries To Win Back The Crowd 
With all of the pieces in place, the clone market took off like a shot after 1984. New companies building PCs based on cheap, mass-market "motherboards" made in factories in East Asia were popping up everywhere, and Compaq became a Fortune 500 company on the success of its Portable and Deskpro ranges. In 1986 Compaq beat IBM to the punch with the first PC to use the new, 32-bit 80386 processor. Between the clone armies and Compaq's meteoric rise, IBM decided that if it couldn't compete on price, it would compete on features, and (it was hoped) introduce a new standard that they alone would control.

The result was the Personal System/2 (PS/2 for short), a line of new PC-based machines that were deliberately very different from the prevailing PC standards. The new machines used a new, IBM-proprietary expansion bus called "Micro Channel", which was faster than the AT's bus (by now referred to as "ISA", for "Industry Standard Architecture", since IBM claimed a trademark on "AT") but completely incompatible with it and protected by IBM patents, requiring anyone who wanted to use it to go through a lengthy licensing process and pay royalties. The other major feature the PS/2 line introduced was a new video subsystem called the Video Graphics Array (VGA), a substantial upgrade to EGA that added a new 640×480 high-resolution mode (familiar now as the mode Windows 2000 and XP use for their splash screens), analog RGB video with an 18-bit palette (over 262,000 colors), and up to 256 colors on-screen at once. The rest of the industry embraced VGA enthusiastically, with 100% VGA compatibility becoming a must for video-card makers.

The VGA proved very popular with game developers. What it lacked in tricks like sprites, blitting and scanline DMA, it compensated for by being tweakable (hacked 256-color modes were very popular, providing resolutions up to 360×480) and having high-speed, easy-to-use video memory. The base 320×200×256 mode, however, was the easiest and the fastest, and many groundbreaking games of the late 1980s and early 1990s were written with this mode in mind, including the Sierra and LucasArts point-and-click adventures, Wolfenstein 3D, and Doom. The 640×480×16 mode, on the other hand, was extremely popular with early graphical OSes and GUI-based DOS software, and it remains a bare-bones compatibility mode in many modern OSes as well. Later IBM innovations, like the 1024×768×16 XGA mode, became a few of the more-or-less standard modes in the swirling chaos generally known as "Super VGA".
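The appeal of the base 320×200×256 mode is easy to show in miniature: the whole screen is a flat, 64,000-byte array of one-byte palette indices starting at segment A000, so plotting a pixel is a single multiply and add. A hedged Python sketch of the addressing (the helper names are ours, for illustration; a real DOS program would write to video memory at A000:0000):

```python
# Why VGA's 320x200x256 mode ("mode 13h") was so easy to program: one byte
# per pixel, row-major, in a single 64K segment. Illustrative sketch only.
WIDTH, HEIGHT = 320, 200

def pixel_offset(x: int, y: int) -> int:
    """Offset of pixel (x, y) from the start of video memory (A000:0000)."""
    return y * WIDTH + x  # one palette index per byte

framebuffer = bytearray(WIDTH * HEIGHT)  # 64,000 bytes -- fits in one segment

def put_pixel(x: int, y: int, color: int) -> None:
    framebuffer[pixel_offset(x, y)] = color  # real code: poke A000:offset

put_pixel(160, 100, 15)  # center of the screen, palette entry 15
assert len(framebuffer) == 64000
assert framebuffer[100 * 320 + 160] == 15
```

Compare this with EGA's four bit-planes, where the same pixel write took several I/O-port accesses; the difference is much of why developers flocked to this mode.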

Most PC gamers had been content with the old beeping speaker, apart from those whose Tandy 1000s had the three-voice sound chip inherited from the PCjr; but sound, too, was rapidly improving in the late '80s, independently of IBM. The AdLib and Roland MT-32 sound cards added synthesized music to PC games and were widely supported in the late '80s and early '90s, but the Creative Labs Sound Blaster, introduced in 1989, added digitized sound. This, combined with AdLib compatibility, made it an irresistible add-on for serious PC gamers.

    Local Bus Wars 
Micro Channel didn't fare so well, though. Only a handful of other PC makers adopted the bus, and while a few outside peripheral makers made MCA-compatible devices, the vast majority were designed and built by IBM as build-to-order add-ons. Miffed at IBM's attempt to corner the high-end PC market, Compaq and several other PC makers introduced a competing standard called "Extended ISA", or EISA for short. EISA expanded the bus to 32 bits and added "bus mastering" support (which let the CPU do other things while a data transfer from, say, a disk was happening) and MCA-style, semi-automatic configuration, while maintaining compatibility with regular ISA cards. EISA was popular mainly on servers and high-end PCs; desktops didn't need that kind of bandwidth yet. Its auto-configuration system was later backported to ISA as "Plug and Play" as part of the development effort leading up to Windows 95.

In the early 1990s there was another competitor, the VESA Local Bus (or VLB for short), which added a 32-bit, full-speed side channel to the ISA bus. VLB was originally designed to give bandwidth-hungry video cards a faster connection to the system bus, but it was something of a stopgap measure and didn't last long. Its biggest problem was that it was too deeply tied to the internals of the 486 processor, for which it was developed; the Pentium used a completely different memory bus setup, and converting between the two was notoriously difficult. Also, VLB's specification was not very rigid, and almost all manufacturers tweaked it a bit. This lack of precision made running anything other than video on VLB a potentially dangerous proposition, especially if Mass Storage was involved; most IDE controllers of the day generated their timing signals directly from the VLB, and if it was running faster than the controller expected, bad things would happen. VLB also had mechanical problems. Physically, it was a high-density edge connector positioned next to (and on the far side of) an ISA-16 slot, so it could be used with full-length cards only (earning it the Fan Nickname of "Very Long Bus"); since many cases of the day weren't designed with full-length cards in mind, inserting a VLB card was often quite difficult and risked damaging the card, the motherboard, or the case.

In the end, Intel's new "Peripheral Component Interconnect" (PCI) spec won the "local-bus wars", thanks to its platform-agnostic nature and cleaner architecture. PCI was first announced in 1992 and became popular with the first PCI 2.0 machines in 1995, making VLB obsolete almost immediately. PCI used a single, 124-pin, high-density edge connector (much like 16-bit MCA), which made ISA-style short cards possible. Unlike EISA, PCI wasn't electrically backwards-compatible with ISA, but PCI cards could co-exist with ISA and VLB, and Intel deliberately made the specification easy to implement, removing many of the barriers to acceptance that MCA had faced. It was also designed for future expansion, with "fast" and "wide" versions (66 MHz, 64-bit) designed into the spec instead of being third-party tweaks. PCI spread quickly, showing up on non-Intel machines like Macs and Suns, and was eventually deemed MCA's successor by IBM in the late 1990s.

    Wintel Comes And Wins 
After years of being confined to what were basically fleet sales, IBM discontinued the PS/2 line and MCA in the mid-1990s, preferring instead to concentrate on the revived "IBM PC" brand (new, ISA/PCI-based machines sold as business desktops) and the highly successful ThinkPad line of notebooks, which was introduced in 1992. This marked the end of IBM's dominance of the PC clone market, with the balance of power now shifted to Microsoft, Intel and the clonemakers.

The introduction of Windows 3.0 in 1990 also finally made Windows a legitimate platform after several years of false starts; it placed higher demands on both graphics hardware and Mass Storage, and it was this need for better hardware that drove PC development. With the introduction of PCI between 1993 and 1995, along with improved video, sound and storage hardware, the PC started to look less like a classic 8-bit computer with bolted-on upgrades and more like a high-end RISC workstation. The introduction of the second-wave Pentium in 1995 and the Pentium II and AMD K6 in 1997, along with the ACPI API in 1998, blurred the distinctions even further, and convinced people that a cheap desktop could perform as well as an expensive UNIX workstation. AMD sweetened the deal further in 1999 with the announcement of the "x86-64" instruction set, which added 64-bit capability to the PC and fixed some of the 80x86's long-standing quirks; the first CPU to feature it was the Opteron, released in 2003.

This caught Intel in a bad spot. Intel had been working with Hewlett-Packard on a new 64-bit processor called "Itanium", and had pretty much bet the company on it; much like what Compaq, DEC and Microsoft had wanted to do with MIPS, and what Apple, IBM and Motorola had wanted to do with PowerPC, Intel had made ambitious plans to discontinue the Pentium and move the entire PC platform over to Itanium by the mid-2000s. The Itanium was not a regular CPU. It used a new architecture based on "very long instruction words", bundles of instructions that drove the CPU's internal units more or less directly instead of relying on a hardware scheduler; the job of keeping the CPU's multiple logic units and FPUs busy therefore fell to the compiler, a complex process that was (and is) extremely difficult to get right in software. Because of this, the Itanium was hard to develop for, at least at first, and performance suffered because of the immature tools. On top of that, the chip was late to market and incredibly expensive once it did show up, leading trade papers to call it "Itanic", since it seemed to be sinking fast. Despite all this, Intel stood by Itanium right up until the Opteron came out, then changed their minds quickly upon seeing how popular it had become, working on plans to add 64-bit support to the Pentium 4 line as well as the then-upcoming Core 2 processors. Since then, the Itanium has found a niche in replacing older RISC systems like DEC's Alpha and HP's PA-RISC, and the Itanium 2 has found use in high-performance computing, but it never had the mass-market success Intel was hoping for.

    Present Day, Present Time, A-ha-ha-ha-ha… 
Today, the PC's various implementations are collectively the most popular desktop computer in the world, and have even made inroads into scientific and high-performance computing due to huge leaps in processing capability as well as an emphasis, post-Pentium 4, on power savings. Several attempts to update the PC using newer parts have come and gone; most of them failed after 1995 as the PC's hardware ended up absorbing most of the features that a switch to MIPS or PowerPC would have brought, including (eventually) the RISC philosophy itself. The Pentium Pro and its descendants are actually RISC processors internally, and use speculative execution and "micro-ops" to handle the rather haphazard x86 ISA instead of trying to execute it directly, as all processors up to the Pentium had.

On top of all this, the rise of the clones meant that pretty much everyone was selling nearly identical systems with little to differentiate them; this made margins even tighter and made PC makers more reliant on advertising, system aesthetics (the aforementioned ThinkPad was one of the first PCs to buck the trend of the generic beige box), gimmicks such as "100x CD-ROM" software and other pack-ins, and, if all else failed, price. This environment made it much more difficult for a new player to enter, as any new system would almost certainly be more expensive than a regular PC and would carry the additional hassle of porting all the software. Apple was able to stick it out thanks to clever advertising and innovative system design, but they, too, eventually switched to x86 in 2006.

So far, x86 has seen off all attempts to replace it, with most of its challengers either long obsolete (6502, 68000) or relegated to embedded systems (MIPS, PowerPC, although the latter was used in Macs until the mid-2000s). However, ARM CPUs have seen massive increases in computing power since the mid-2000s, and may end up being the first real challenge to x86's dominance on the desktop in decades. Chips based on the ARM architecture have dominated consumer electronics for years because of their clean system design and low power usage (which means better battery life), whereas Intel had to cut features and performance from the Atom just to get it cool enough for netbooks. Chips from NVIDIA, Texas Instruments, Samsung and Apple have shown what ARM is capable of, and with many people switching from desktops and laptops to tablets, even Microsoft has gotten concerned — Windows 8 was ported to ARM as "Windows RT", and Windows 10 has a full-featured ARM port. There have also been rumors that Apple may be switching their desktop machines to ARM.

As a footnote, IBM themselves left the personal computer business for good in 2005, selling their PC division to a Chinese company named Lenovo (hence Lenovo now sells the ThinkPad).


Because of the platform's Long Runner status and its eventual passing into the public domain, there are literally tens of thousands of different makes, models and configurations, made by hundreds of producers (both large vendors contributing to the development of the platform and garage tinkerers assembling their own dream machines), so only notable models and some of the typical examples of an era will be presented here.

    Original IBM machines 
Note that for most of the platform's history under IBM, these computers were aimed squarely at business users, and any entertainment capabilities like sound, multimedia and accelerated graphics were left to third parties. IBM basically ignored that market entirely.


The original machine that started it all; actually IBM's third attempt at a personal computer, and the first that really took off.
  • Intel 8088 processor running at 4.77 MHz
  • Optional Intel 8087 math co-processor
  • 64K RAM, expandable to 640K; some expansion systems could backfill the unused video area even further, up to 704K with an MDA or 736K with only a CGA installed
  • Five 8-bit expansion slots
  • Keyboard and cassette ports
  • Optional single or dual 5¼" floppy drives — originally single-sided, carrying just 160K
  • Could be ordered with either an MDA (80×25 text only with blink, bold, underline and reverse-video effects) or a CGA (80×25 or 40×25 text in 16 colors, 320×200 in four colors, 640×200 monochrome; a hacked 160×100 16-color mode was also available)
    • CGA's 80×25 text mode was a joke. Unlike MDA's similar mode, it used an 8×8 matrix per character which, accounting for letter and row separation, left only 7×7 pixels for the symbol itself, at best. MDA's 9×14 character tile, a feat unmatched at least until VGA, allowed then-unprecedented text clarity and readability, no small selling point for the early models, which were used mostly as office machines. CGA characters, in contrast, looked ugly and grainy and were nigh unreadable. Several clone vendors (most notably Compaq and AT&T/Olivetti) remedied this by providing a double-scan text mode, which ran either at MDA's 720×350 using its character matrix (Compaq), or at 640×400 using a much more legible 8×16 character matrix (AT&T, Olivetti, and several "super CGA" and "super EGA" boards).
  • It ran MS-DOS, a single-tasking, 16-bit Operating System that in its early days was basically a clone of Digital Research's CP/M, and didn't actually provide much in the way of an API beyond a file system and limited memory management. Everything else was up to the programmer.
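The MDA and CGA text modes listed above were also trivially simple to drive: both store the screen as 80×25 two-byte cells, a character code followed by an attribute byte, in a memory-mapped buffer (B000:0000 for MDA, B800:0000 for CGA). A Python sketch of the cell encoding (the function names are ours, for illustration; real code would poke the video buffer directly):

```python
# How IBM text modes encode a screen cell: two bytes per cell, 80x25 cells.
# Illustrative sketch, not period code.
def attribute(fg: int, bg: int, blink: bool = False) -> int:
    """Attribute byte: foreground in bits 0-3, background in bits 4-6, blink in bit 7."""
    return (fg & 0x0F) | ((bg & 0x07) << 4) | (0x80 if blink else 0)

def cell_offset(col: int, row: int) -> int:
    """Byte offset of a cell in the text buffer (B000:0000 MDA, B800:0000 CGA)."""
    return (row * 80 + col) * 2

# Put a white-on-black 'A' in the top-left corner of a simulated buffer:
buffer = bytearray(80 * 25 * 2)
off = cell_offset(0, 0)
buffer[off] = ord('A')          # character byte
buffer[off + 1] = attribute(7, 0)  # light grey on black
assert buffer[0] == 65 and buffer[1] == 0x07
```

Writing two bytes per character, with no OS call in between, is why DOS-era text applications felt so fast even on a 4.77 MHz machine.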


A somewhat improved model, introduced in 1983 and made to look less like an 8-bit toybox with bolted-on upgrades and more like its own thing. Otherwise same as the PC, except with more memory, standard floppy drives and a hard disk option.
  • Same 4.77 MHz, 8/16-bit Intel 8088 CPU as the original PC
  • Standard 128K or 256K RAM
  • Floppies became double-sided, with 320K (later 360K) capacity
  • Eight expansion slots, all 8-bit
  • Several optional hard drives were offered, on 8-bit controller cards, with 5M to 10M capacities
  • Cassette port removed


First major upgrade to the system, it introduced a much faster, new-generation CPU, a major boost in performance and a promise of true multitasking on an inexpensive PC.
  • Intel 80286 processor running at 6 or 8 MHz, depending on age
  • Up to 16 MB "extended" memory using add-on cards
  • 20 or 40 MB high-performance hard drive — originally still an option, but a must by the end of the 1980s
    • The "AT Attachment" (ATA, commonly known as IDE) interface started here, not as an integrated controller at first, but as the Western Digital ST-412/floppy combo controller board IBM included with the AT. Later on, Compaq asked Western Digital to make a standardized interface for the controller that didn't require all of the 80-odd data, address, control and interrupt signals present on the AT bus, and they were able to reduce it to 40 pins (all 16 data lines, plus a simplified addressing scheme, interrupts and handshaking). After making a few prototype drives with Control Data, Compaq recruited Conner Peripherals to start building ATA drives for them. By 1990, other manufacturers (including Western Digital themselves, Seagate, Quantum and Maxtor) had followed suit, switching from ST-412 and ESDI drives to ATA drives; ST-412 was dead not long after that.
  • Single or dual "high density" floppy drives — up to 1.2 MB capacity
  • Optional MDA, CGA, EGA or the rare and expensive Professional Graphics Controller, an early GPU meant for CAD use. EGA was a major upgrade over CGA, finally bringing high-resolution (640×350) color graphics to the platform. EGA could display 16 colors out of a 64-color palette, and despite being pretty awkward to program, it carried a whopping 64K of video RAM and allowed a clever programmer to pull off tricks in software, like smooth scrolling, that were previously thought possible only on consoles with their hardware accelerators.
  • On the software side, it was the last model intended to run the "stop-gap" MS-DOS, with the next IBM range expected to use the new "grownup" OS—the much vaunted OS/2, which promised true multitasking and a graphical user interface.
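Incidentally, EGA's awkwardness to program came largely from simple arithmetic: a packed-pixel 640×350 16-color screen wouldn't fit in a single real-mode 64K segment, so IBM split each pixel across four one-bit planes instead. A back-of-the-envelope Python sketch of the trade-off (our own illustration, not anything from IBM documentation):

```python
# Why EGA used planar video memory: a packed 640x350x16 screen overflows
# one real-mode segment, but four bit-planes each fit comfortably.
SEGMENT = 64 * 1024            # a real-mode program sees memory in 64K windows

w, h, bits = 640, 350, 4       # EGA's 16-color hi-res mode
packed = w * h * bits // 8     # bytes needed if pixels were packed together
assert packed == 112_000       # too big for one 64K segment...

planes = 4                     # ...so EGA stores one bit per pixel per plane
per_plane = w * h // 8
assert per_plane == 28_000     # each plane now fits easily in a segment
assert packed > SEGMENT and per_plane < SEGMENT
```

The cost was that updating a single pixel meant touching up to four planes through I/O-port latches, which is exactly the awkwardness (and the opportunity for clever tricks) described above.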


IBM's attempt to fight the hordes of the clonemakers by releasing the much improved, but closed and patented MCA bus with an attendant computer platform in 1987. The range was met with mixed success (they were expensive, even compared to high-end competitors like Compaq, and much of their market share was from "no one ever got fired for buying IBM" fleet buyers), but some of the innovations introduced here have stuck around to this day.
  • Intel 8086 to 80486 processors, in all available speeds from 8 to 66 MHz
    • IBM's decision to retain the by-then entirely obsolete 8086 and the aging, notoriously idiosyncratic 80286 in the lineup was among the first steps in the IBM/Microsoft OS quarrel that led to the birth of Windows NT
    • Late into the system's life a 90 MHz Pentium model was offered
  • Finally, a standard hard drive was included, in either IDE or proprietary ESDI versions; the 8086-equipped Model 25 carried an ST-506-based RLL model
  • 512K (for a Model 25) to 32M memory
  • 5¼-inch floppies were replaced by Sony-made 3½-inch ones, the same as on Macs. Unlike the Apple-supplied constant-linear-velocity (CLV) variants, however, these were constant-angular-velocity (CAV), which made them cheaper and more reliable, but limited their capacity to just 720K and made formatted floppies non-interchangeable between the two systems. Later, high-density 1.44M and 2.88M versions were introduced.
  • Another enduring innovation was the fast serial mouse and keyboard connectors that are to this day called simply "PS/2 ports"
  • The VGA graphics board was first introduced here, in an MCA variant; an ISA version was made available shortly thereafter
    • VGA would end up exploding the PC gaming world. Not only did it introduce the new 640×480×16 hi-res mode, it also introduced incredibly rich and colorful 256-color modes, at the pretty nifty resolutions of 320×200 and (tweaked) 320×240 to boot. The former was the same resolution CGA and EGA used, letting game developers support VGA and the older adapters with little more than different palettes. Moreover, unlike EGA, these 256-color modes were very fast and straightforward, using packed pixels and fitting into a single memory segment. For the first time in history, PC graphics could seriously compete with the consoles and the Amiga.
  • Not everything was as bright on the OS side, though. OS/2 was delayed, and when it finally shipped, the original 1.0 version lacked the much-ballyhooed GUI (which arrived only in version 1.1, about a year later); it couldn't run more than one DOS application (and even that not very well), and it didn't work on 8086 CPUs at all. What's more, the differing ideas IBM and Microsoft had about its development eventually led to Microsoft ditching the project for good in favor of Windows, about midway through the development of what was intended to become "OS/2 3.0". OS/2 also worked entirely differently from MS-DOS, requiring programmers to learn an entirely new API and architecture, which contributed to its perennial lack of software.


Games (by no means an exhaustive list):

    Booter Games 
Games that did not use an operating system, and were run by simply inserting the floppy and turning on the computer. They fell out of favor once hard drives became common, which let gamers store games on the machine instead of rebooting it with a different disk.

    DOS Games 
Games that run through DOS. These remained popular until the mid-'90s, when Windows replaced DOS as the OS on most computers. Windows has tried to keep compatibility with them throughout the years; however, with Windows no longer being DOS-based, along with the move to 64-bit versions of Windows that don't support older 16-bit programs, the use of an emulator (such as DOSBox) is almost required at this point.

Exclusive titles and Multi-Platform games that started here:


    Windows Games 
Games that need Windows to run. Windows has been a popular gaming platform since the late '80s, and is now a requirement for playing almost all modern PC games (with OS X and Linux support slowly becoming more popular). Theoretically, a modern machine is compatible with all of the games listed; however, many need fixes due to changes made over the past 25 or so years, with the very old 16-bit ones requiring an emulator on 64-bit systems due to support being axed.

Exclusive titles and Multi-Platform games that started here: