THE 32-BIT ’90s
A decade when computers would adopt desktops and multimedia
Previous versions of Windows were unsuccessful, but with 1990’s Windows 3.0, the PC desktop was finally seen as a viable alternative to the Macintosh and Amiga. Windows 3.0 brought a new interface, multitasking abilities, and mouse-driven productivity suites that freed users from the command line.
Meanwhile, IBM’s OS/2 had been trying to establish itself as the respectable GUI for corporate America. By 1990, the alliance between IBM and Microsoft had essentially ended, and the two became rivals. Although newer versions of OS/2 would be more advanced, for now Microsoft had the technological advantage: IBM was still hampered by 286 machines, which kept OS/2 primarily 16-bit and unable to use the advanced features of the 386.
April 1992 finally saw OS/2 become 32-bit. In most ways, it was superior, with extensions to DOS and Windows 3.x support in a stable environment. But while Windows targeted clone machines, OS/2 targeted IBM hardware, so it couldn’t run on many clones where Windows ran perfectly. Furthermore, while IBM sold OS/2 as a separate product, Microsoft bundled Windows with new PCs.
Microsoft’s dominance started with Windows for Workgroups 3.11 in August 1993. It had new 32-bit capabilities and proper networking. It devoured the business space, and 3.11 would be the environment many people grew up with.
THE MULTIMEDIA AGE
In the mid-’90s, every PC had a soundcard, CD-ROM drive, and tinny set of multimedia speakers. CD-ROM’s 650MB of storage allowed more expansive gaming, with FMV cutscenes and CD-audio soundtracks. Schools bought edutainment packages with archived video and interactivity.
By now, the 486 was standard. Although 386s were still functional business machines, you needed a 486 to enjoy this era. Thankfully, hardware prices fell dramatically; while ’80s PCs usually had Intel CPUs, rival manufacturers were on the ascent and lowering costs.
Although AMD’s CPUs were often a generation behind Intel’s, they were more efficient and allowed higher clock speeds, giving similar performance at much lower prices. Cyrix, meanwhile, was making a name for itself with 486-upgrade processors, giving 386 owners a cheap upgrade route: a new CPU in their old motherboard.
1993’s Intel Pentium brought the next generation of CPUs. Intel dropped the “86” to differentiate itself from other manufacturers, with “Pent” coming from the Greek “penta,” meaning five (implying a 586 without saying it).
The Pentium offered almost twice the performance per clock cycle of the 486, but early Pentiums ran at only 50–66MHz. Meanwhile, AMD was pushing its 486 clock speeds ever higher, with chips such as the DX4-120 running at 120MHz and nearly matching early Pentiums. AMD’s strong performance and low prices attracted manufacturers such as Acer and Compaq, while Cyrix’s efficient designs caught IBM’s eye, starting a partnership in 1994.
1995 saw the introduction of the ATX standard still used today, defining new mounting placements and features such as soft power-off. Unlike the XT and AT form factors, this change was driven by Intel rather than IBM.
August 1995 would see the biggest change to the computing landscape yet: Windows 95. On the technical side, Windows 95 was designed around 32-bit preemptive multitasking, compatibility with existing DOS and Windows programs, and new tech such as DirectX and Plug and Play support. But the real change was the interface. A taskbar, a “Start” button in the bottom-left, the “Maximize,” “Minimize,” and “Close” buttons at the top-right of the window.... We take these norms for granted now, but they started with Windows 95.
Windows 95 truly established the Microsoft goliath. Computing had become mainstream, and Microsoft was a household name. It was over for competitors: Commodore had gone bankrupt, Atari had hit the wall, and Apple was barely surviving. IBM still had OS/2, with its newer Warp release from a year prior, but this only supported Win 3.x applications and sank into irrelevancy.
When Windows 98 arrived, it fixed many of the teething problems of Win 95, with a more stable system, better hardware support, and UI enhancements. This was also when the antitrust lawsuits began, as Microsoft bundled Internet Explorer with Windows, itself already bundled with new computers. Now Microsoft would dominate not just PCs, but Internet browsers too.
THE PERFORMANCE AGE
3D accelerator cards—such as 3dfx’s Voodoo 2, Nvidia’s Riva TNT, and ATI’s Rage series—would be a defining feature of the late ’90s. 3D acceleration brought a new era of PC gaming. Where previous games relied on the CPU for all rendering, these new graphics cards added a GPU (graphics processing unit), which took the graphical processing burden away from the CPU, allowing substantially faster gaming and stunning graphical effects.
Although 3dfx tried to corner the market with its proprietary Glide API, it eventually lost out to competitors who used market standards such as DirectX and Silicon Graphics’s OpenGL. The ultimate card of the ’90s would be 1999’s Nvidia GeForce 256.
This is the point where the CPU race was whittled down to AMD and Intel.
Until now, things had looked great for Cyrix. The mid-’90s saw 5x86 upgrade chips for 486 machines, followed by the 6x86 in October 1995. The 6x86 outperformed mid-level Pentium machines for less money; Cyrix was becoming a technological leader rather than just a budget manufacturer.
Business was good until complex 3D games such as Quake uncovered Cyrix’s embarrassing floating-point performance. Cyrix chips were great at spreadsheets, but terrible at gaming, which tarnished the brand. 1997’s MediaGX helped improve things, with a system-on-chip design perfect for laptops and the budget PC market, but as Intel continued to advance, Cyrix did not.
Newer-generation CPUs were really just higher-clocked 6x86s, prone to high failure rates and still poor at gaming. The Cyrix-IBM partnership ended in 1998, and worse yet, Intel soon entered the budget market with its Celeron line. Cyrix ran out of cash, and its technology was bought by VIA in 1999, which gradually phased out the brand.
AMD, meanwhile, went from strength to strength. During the Pentium era, reverse-engineering Intel’s processors had become too complex, so AMD started designing its own processors rather than following Intel’s designs.
In 1996, AMD released the K5, the first Pentium rival, but 1997 brought true success with the K6. This was a proper rival to the new Pentium II, yet it worked in older Socket 7 motherboards. The K6 series was wildly successful, thanks to its lower prices and, from the K6-2 onward, its famous 3DNow! instructions. The successive K6-2 and K6-3 chips continued to rival the advancing Pentium II and III, and would eventually dominate most of the sub-$1,000 market.
We would end the decade with 1999’s K7 Athlon, which in early 2000 became the first retail CPU to break the 1GHz mark.
The ’90s were a time of survival of the fittest, ending with one dominant OS and two CPU makers. Thankfully, the GPU market still had a few years of diversity remaining.