Pity the poor PC of 1983 to 1984. It didn't have the impressive visual capabilities that typically come to mind when you think of powerful graphics. IBM's machines, like the IBM PC 5150, were the talk of the business world, but they were limited to text-only displays or low-resolution bitmap graphics.
The best colour graphics on offer was 320 x 200 pixels from a CGA graphics card, with a limited but distinctive palette of four colours chosen from the 16 available hues. That was as good as it got, and it's hard to pretend it was good. Look closer and three of those four colours were fixed by whichever of CGA's two hard-wired palettes you chose (cyan, magenta, and white, or green, red, and brown), while half of the 16-colour master palette was little more than a brighter variation of the other half.
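As a rough illustration of how that worked (a sketch of standard CGA behaviour, not something taken from this article): in the 320 x 200 mode, each framebuffer byte packs four 2-bit pixels, and each 2-bit value indexes one of the two hard-wired palettes, with only entry 0, the background, freely chosen from the full 16 colours.

```python
# Illustrative sketch of CGA's 320x200, four-colour mode (assumed standard
# behaviour, not code from the article).

# The two hard-wired palettes; an intensity bit also offers brighter variants.
# Only entry 0, the background, can be picked from the full 16-colour set.
CGA_PALETTES = {
    0: ["<background>", "green", "red", "brown"],
    1: ["<background>", "cyan", "magenta", "white"],
}

def unpack_cga_byte(byte: int) -> list[int]:
    """Split one CGA framebuffer byte into four 2-bit colour indices, leftmost pixel first."""
    return [(byte >> shift) & 0b11 for shift in (6, 4, 2, 0)]

print(unpack_cga_byte(0b11100100))  # [3, 2, 1, 0]
```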
By the mid-1980s, IBM's CGA card was looking laughably outdated. Even modest home computers like the Commodore 64 could display 16-colour graphics, while Apple was poised to introduce the Apple IIc, capable of a 560 x 192 resolution with a palette of 16 colours. IBM's other display option, the Monochrome Display Adapter (MDA), offered crisp high-resolution text but no bitmap graphics at all. So in 1984, IBM launched the Enhanced Graphics Adapter (EGA), a graphics card that would transform the PC landscape.
The first EGA graphics card
The EGA (Enhanced Graphics Adapter) was a separate, optional add-in card introduced alongside the IBM PC/AT. It used the standard 8-bit Industry Standard Architecture (ISA) bus and was supported out of the box by the new model, while earlier IBM PCs often needed a BIOS upgrade before they could use it.
Measuring over 13 inches long, it was a complex board packed with dozens of large-scale integration (LSI) chips, memory controllers, memory modules, and crystal timers to keep everything in sync. It shipped with 64KB of onboard RAM, expandable up to 192KB with the addition of a Graphics Memory Expansion Card and a further Memory Module Kit.
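A quick back-of-the-envelope calculation (my illustration, not a figure from the article) shows why those memory upgrades mattered: a 16-colour frame at the card's highest resolution needs four bits per pixel, which is more than the stock 64KB can hold, so unexpanded cards had to drop to fewer colours in that mode.

```python
# Rough framebuffer maths for the EGA's highest-resolution mode (illustrative).
def framebuffer_bytes(width: int, height: int, bits_per_pixel: int) -> int:
    """Bytes needed to store one full frame at the given colour depth."""
    return width * height * bits_per_pixel // 8

base_ram = 64 * 1024                        # stock card: 64KB
needed = framebuffer_bytes(640, 350, 4)     # 16 colours = 4 bits per pixel
print(needed, needed > base_ram)            # 112000 True - doesn't fit in 64KB
```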
Designed primarily to drive the new IBM 5154 Enhanced Color Display monitor, these first EGA cards also remained compatible with existing CGA and MDA displays. IBM kept things consistent with a single 9-pin D-sub connector, plus four DIP switches on the back of the card for selecting which type of monitor was attached.
EGA was a huge step up from low-resolution, colour-starved CGA. It could push graphics up to 640 x 200, or even 640 x 350, with 16 colours on screen at once chosen from a master palette of 64. That wasn't going to blow away anyone who owned an 8-bit home computer, but together with the 286 processor, EGA put the PC/AT back in the game.
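To make the "16 from 64" idea concrete, here is a minimal sketch (my own illustration, assuming the standard EGA palette encoding rather than anything quoted in this article): each of the 64 master-palette colours is a 6-bit value with a primary and a secondary intensity bit per channel, and only 16 of those values can be loaded into the palette registers at once.

```python
# Minimal sketch of decoding a 6-bit EGA master-palette value into 8-bit RGB
# (assumed standard encoding: bits 0-2 are the primary blue/green/red bits,
# bits 3-5 the lower-intensity secondary bits).
def ega_to_rgb(value: int) -> tuple[int, int, int]:
    """Decode a 6-bit EGA palette value into an (R, G, B) triple."""
    if not 0 <= value <= 0x3F:
        raise ValueError("EGA palette values are 6 bits (0-63)")

    def channel(primary_bit: int, secondary_bit: int) -> int:
        primary = (value >> primary_bit) & 1      # worth 2/3 of full intensity
        secondary = (value >> secondary_bit) & 1  # worth 1/3 of full intensity
        return primary * 0xAA + secondary * 0x55

    return channel(2, 5), channel(1, 4), channel(0, 3)

print(ega_to_rgb(0x14))  # (170, 85, 0) - the classic EGA brown
```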
In this screenshot from The Secret of Monkey Island, the contrast between 16-colour EGA (left) and four-colour CGA (right) shows just how much difference 12 extra hues can make.
The price of EGA
For all its capability, the EGA standard had one big drawback: it was extremely expensive, even by the standards of a time when computers were already astronomically pricey. The basic EGA card cost over $500 (roughly $1,400 in today's money), and adding the Graphics Memory Expansion Card pushed that total to $699.
Fill it out to its full 192KB of RAM and you'd spent nearly $1,000 – roughly $2,900 in today's money – on a top-end EGA card, making it significantly more expensive than a GeForce RTX 4090. On top of that, the monitor you needed to get the most out of it was worth an estimated $3,550. Early EGA adoption was a game for wealthy enthusiasts.
Yet for all the original card's complexity, its design and input/output behaviour proved surprisingly straightforward to reverse-engineer. Within a year, a smaller firm, Chips and Technologies (C&T) of Milpitas, California, had designed an EGA-compatible graphics chipset.
It condensed IBM's sprawling collection of chips into a much smaller set, which could sit on a smaller, cheaper board. The first C&T chipset launched in September 1985, and within another two months half a dozen firms had launched EGA-compatible cards, including the ATi card pictured below.
By 1986, numerous chip makers had produced their own EGA clone chipsets and add-in boards, with more than two dozen companies selling them and collectively grabbing a significant 40% share of the growing graphics add-in card market. One of them, originally known as Array Technology Inc., would become better known as ATI before eventually being absorbed into AMD. The red team of today's GPU wars got its start here.
EGA gaming
The Enhanced Graphics Adapter (EGA) had a profound impact on PC gaming. PC games existed before EGA, of course, but many were text-based or built around the severe constraints of CGA. EGA let developers create PC games that were genuinely good-looking as well as good to play.
This didn't happen overnight. EGA support only began to gain real traction in 1987, and it took until around 1990 for the cost of 286 PCs, EGA cards, and monitors to fall within reach of most buyers. But EGA helped spur on the rise of the PC RPG, including the legendary SSI 'Gold Box' series of Advanced Dungeons & Dragons titles, Wizardry VI: Bane of the Cosmic Forge, Might and Magic II, and Ultima II through Ultima V.
EGA's capabilities also fed a fresh wave of good-looking point-and-click adventure games, such as Roberta Williams' King's Quest II and III, as well as The Colonel's Bequest, shown in the screenshot below. EGA let LucasArts pioneer its groundbreaking point-and-click adventures, with the likes of Maniac Mansion and Loom rendered in 16 colours. And while many games stuck to 320 x 200, some titles, such as SimCity, could take advantage of the sharper 640 x 350 mode.
What's more, EGA made real action games on the PC a sensible proposition. The Commander Keen series proved that PCs could handle scrolling 2D platformer gameplay, while ports of Apple II games like Prince of Persia could finally look the part, freed from the limitations of four-colour graphics.
And when John Carmack, the coder behind Commander Keen, set out to build a 3D sequel to the Catacomb series of dungeon crawlers, he came up with something genuinely groundbreaking. Catacomb 3-D and its follow-up, The Catacomb Abyss, gave Carmack his first experience of building a texture-mapped 3D game engine, laying the groundwork for the first-person shooter genre.
EGA had obvious limitations, above all its restricted colours and palettes, but skilled artists working within those 16 hues could still craft richly detailed game worlds that transported players somewhere new.
The decline of EGA graphics
EGA's time at the top of the graphics pile was brief. Computers kept evolving, with Commodore launching the Amiga in 1985 with support for 64 colours in games and up to 4,096 hues in its HAM mode. And even as it introduced EGA, IBM was unveiling newer, higher-end boards, including the IBM Professional Graphics Controller (PGC), capable of 640 x 480 screens with 256 colours from a total palette of 4,096.
The PGC was hugely expensive and aimed squarely at the CAD market, but it laid the groundwork for the VGA standard that followed, introduced alongside the IBM PS/2 in 1987. VGA supported resolutions up to 640 x 480 in 16 colours, plus a 320 x 200 mode with 256 colours on screen, and it proved to be exactly what was needed to usher in a new era of PC applications and PC gaming.
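One likely reason that 256-colour mode caught on so quickly with game developers (my aside, not a claim made in this article) is that a full 320 x 200 frame at one byte per pixel fits inside a single 64KB real-mode segment, so the whole screen can be treated as one flat, easily addressed buffer.

```python
# Illustrative check: a 320x200 frame at 8 bits per pixel is 64,000 bytes,
# just under the 65,536-byte limit of one real-mode segment.
frame_bytes = 320 * 200 * 1
print(frame_bytes, frame_bytes <= 64 * 1024)  # 64000 True
```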
VGA remained expensive until the early 1990s, which helped prolong EGA's life; even as VGA spread, plenty of games were happy to run on a 16-colour EGA display. EGA arrived before the PC's gaming heyday, but it helped pave the way for it.
We hope you enjoyed this nostalgic look at the PC's early 16-colour gaming days. For more vintage PC gaming, check out our piece on the first 3dfx Voodoo graphics card and our guide to building a retro gaming PC, which covers all the hardware you need to put together a DOS gaming rig.