Objects in Space - PC Monitor Evolution
What’s the real difference between PC gaming and every other platform? Is it control systems, graphical power, genres of play, price? No: for each of those there is an equivalent. The real difference is variable screen resolution. PC gamers have suffered through decades of it.
At this point, it’s pretty difficult to argue that the entire gaming industry hasn’t been driven by advances in open-platform PC technology. Everything from the USB plugholes in your PlayStation 4 to the Bluetooth and low-powered GPU that make the Nintendo Switch workable at all comes from the PC.
The Switch is an especially good example, because unlike the big box consoles, it has to be able to run every single game in at least two resolutions, specifically 1280x720 (in handheld mode) and 1920x1080 (in docked mode).
Dynamically switching between resolutions, while a game is running, is no trivial thing. Fortunately, Nintendo had approximately 30 years of PC experience to draw on.
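If you want a feel for what that involves, here’s a minimal sketch in C - entirely hypothetical, nothing to do with Nintendo’s actual SDK - of the decision a game has to make every time it’s docked or undocked.

```c
/* Hypothetical sketch (not Nintendo's real API) of the mode decision a
 * Switch game makes whenever it's docked or undocked. */
#include <stdio.h>
#include <stdbool.h>

typedef struct { int width, height; } Resolution;

static Resolution pick_output_mode(bool docked) {
    Resolution handheld = { 1280, 720 };   /* handheld mode */
    Resolution tv       = { 1920, 1080 };  /* docked mode */
    return docked ? tv : handheld;
}

int main(void) {
    for (int docked = 0; docked <= 1; docked++) {
        Resolution r = pick_output_mode(docked);
        /* A real engine would now reallocate its framebuffers and
         * recompute UI scale, e.g. ui_scale = r.height / 720.0f. */
        printf("%s: %dx%d\n", docked ? "docked" : "handheld",
               r.width, r.height);
    }
    return 0;
}
```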
It really is hard to overstate the difference between the development of the Apple II and the IBM PC. Here was Apple, being all precious and independent, depending entirely on the equivalent of electronic wizardry on the part of Steve Wozniak and the interpersonal relationship equivalent of being hit with a cricket bat with nails in it on the part of Steve Jobs. Meanwhile, IBM was spending roughly a brazilian dollars on cramming an entire mainframe into a box barely six or seven times the size of a modern PC.
Key to this was the IBM PC’s graphics output. The Apple II Plus (the current model when the first IBM PC came out in 1981) could do either 16-colour graphics at 40x48 (not a typo) or six-colour graphics at 280x192. They called that hi-res mode.
IBM’s new Color Graphics Adapter destroyed this with a massive boost to 320x200. Yeah, take that, Apple!
See, back in the 1980s, personal computers had various graphics modes. Mostly, they split between text-only, standard detail graphics, and high detail graphics.
IBM’s CGA adaptor ran a whole bunch of confusingly different modes and did astoundingly arcane stuff to the output signal to make it work on a TV. Interestingly, the original output standard was digital - but only worked on a specialised IBM monitor. An RGBI monitor [RUNTIME ERROR - OBVIOUS JOKE MISSING LINE 84]. Remember: all this stuff cost not thousands, but tens of thousands of dollars on release.
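For the curious, RGBI is exactly what it sounds like: one digital line each for red, green and blue, plus an intensity line, for 16 colours total. Here’s a small C sketch of how a 5153-style monitor turned those four bits into colour - including the famous quirk where IBM’s monitor halved the green on colour 6 to give brown instead of dark yellow. (The IRGB bit order here is the usual convention; treat it as an assumption.)

```c
/* Decode a 4-bit CGA RGBI value into 24-bit RGB, the way an IBM
 * 5153-style monitor would. Bit order assumed: intensity, red, green,
 * blue (IRGB). */
#include <stdio.h>

static void rgbi_to_rgb(unsigned rgbi, unsigned char out[3]) {
    unsigned i = (rgbi >> 3) & 1, r = (rgbi >> 2) & 1,
             g = (rgbi >> 1) & 1, b = rgbi & 1;
    out[0] = (unsigned char)(r * 0xAA + i * 0x55);  /* red   */
    out[1] = (unsigned char)(g * 0xAA + i * 0x55);  /* green */
    out[2] = (unsigned char)(b * 0xAA + i * 0x55);  /* blue  */
    /* The quirk: colour 6 would be dark yellow, but IBM's monitor
     * halved the green to produce brown. */
    if (rgbi == 6) out[1] = 0x55;
}

int main(void) {
    for (unsigned c = 0; c < 16; c++) {
        unsigned char rgb[3];
        rgbi_to_rgb(c, rgb);
        printf("colour %2u -> #%02X%02X%02X\n", c, rgb[0], rgb[1], rgb[2]);
    }
    return 0;
}
```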
Anyway, as with so much PC tech, the arrival of the CGA adaptor created economies of scale that dropped the prices on all kinds of components, which let engineers do more, which led to the next generation of kit - in this case, the Enhanced Graphics Adapter.
The EGA was mindblasting in 1984. I mean we’re talking 640x350 with 16 colours, from a palette of 64 colours. That means it could only show 16 colours at once, but had 64 to choose from. That’s so many colours! That’s a two-layer pencil box of colours!
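If you want to see where 64 comes from: two bits per channel means four levels each of red, green and blue, and 4 x 4 x 4 = 64. A quick C sketch that enumerates the lot (the bit packing here is purely illustrative - the real EGA palette register scrambles its six bits differently):

```c
/* The arithmetic behind EGA's "16 from 64": two bits per channel gives
 * four levels each of red, green and blue, so 4 x 4 x 4 = 64 colours.
 * Bit layout below (red high, blue low) is illustrative only. */
#include <stdio.h>

int main(void) {
    const unsigned char level[4] = { 0x00, 0x55, 0xAA, 0xFF };
    int count = 0;
    for (unsigned c = 0; c < 64; c++) {
        unsigned r = (c >> 4) & 3, g = (c >> 2) & 3, b = c & 3;
        printf("%02u: #%02X%02X%02X%c", c, level[r], level[g], level[b],
               (++count % 4) ? ' ' : '\n');
    }
    /* A game then picks any 16 of these for its on-screen palette. */
    return 0;
}
```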
This was also the point where third parties started to mess around with the open nature of the “IBM compatible” PC. It was also where we first saw the 4:3 aspect ratio in 640x480.
But it was in 1987 when things got really serious. You might think of “VGA” as those old blue plugholes on school computers, but it stands for Video Graphics Array (an “array” this time, not an “adapter” - consistency was never IBM’s strong suit).
VGA cemented the 640x480 16-colour resolution as the standard, one that’s still used today. But it could also do 320x200 with an astonishing 256 colours. Sounds low-res? Remember: everyone was using 14 or 15-inch CRT monitors. You could see the pixels, but it still looked crisp. Like Minecraft running in 4K.
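That 320x200 256-colour mode, mode 13h, is also why old-timers get misty-eyed: one BIOS call and the whole screen becomes a flat 64,000-byte array. A DOS-era sketch (Turbo C-style; it won’t build on a modern OS):

```c
/* DOS-era sketch: enter VGA mode 13h (320x200, 256 colours) with one
 * BIOS call, then write pixels straight into video memory at A000:0000.
 * Needs a real-mode DOS compiler such as Turbo C. */
#include <dos.h>
#include <conio.h>

static void set_mode(unsigned char mode) {
    union REGS r;
    r.h.ah = 0x00;          /* BIOS int 10h: set video mode */
    r.h.al = mode;
    int86(0x10, &r, &r);
}

int main(void) {
    unsigned char far *vga = (unsigned char far *)MK_FP(0xA000, 0);
    int x, y;

    set_mode(0x13);                       /* 320x200, 256 colours */
    for (y = 0; y < 200; y++)
        for (x = 0; x < 320; x++)
            vga[y * 320 + x] = (unsigned char)(x ^ y);  /* test pattern */

    while (!kbhit())
        ;                                 /* admire the pixels */
    set_mode(0x03);                       /* back to text mode */
    return 0;
}
```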
Because 1987 was when the clone market for PCs really started to take off, VGA became the standard, well, standard for PCs, and 640x480 became the first widely implemented standard for “good” PC graphics.
EVERYONE WILL BE SUPER, AND THEN NO ONE WILL BE
The year 1987 was important for PC gaming in so many ways. Because it was the year when serious numbers of PC clones started to appear, it also marks the point where IBM threw up its hands and said “Good luck guys!” to the industry.
From 1987, the future of PC gaming would no longer be defined exclusively by IBM. Sure, Intel would shoulder much of the burden, but other consortia and standards organisations began to spring up - particularly VESA, the Video Electronics Standards Association.
As we began to rush headlong into the nutso 90s, having a “VESA compatible” graphics chip made configuring games like Wing Commander and the various King’s Quests so much easier.
VESA championed a standard called, simply, Super VGA. Super VGA wasn’t a specific kind of adapter, but came eventually to mean anything that built off the VGA standard.
For gamers, SVGA meant a resolution of 800x600, 256 colours (8-bit colour depth), and (at first) framerates solidly in the teens.
What you’ll notice about 640x480 and 800x600 is that both these resolutions have an aspect ratio of 4:3. Nothing about SVGA was ever set in stone, but monitor manufacturers needed to be confident their products would work with any of the increasing number of PC brands, and those PC makers could no longer rely on customers buying a (usually expensive) brand-matched monitor.
The PC didn’t have Apple’s luxury of controlling all the hardware, top to bottom, so Super VGA was a complicated standard that could, ultimately, support resolutions all the way up to 2048x1536. Which you would only ever see on a gigantic 21-inch CRT monitor, most likely built by Sony and owned by your mate’s dad who was into CAD in some way.
Of course the maths-wonks among you will realise that 2048x1536 is still 4:3. But the monitor you play your games on today isn’t 4:3. So what’s up with that?
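You can check the aspect-ratio sums yourself by dividing width and height through by their greatest common divisor - a quick C snippet, with a few later resolutions thrown in for comparison:

```c
/* Reduce width:height to its simplest ratio via the GCD. */
#include <stdio.h>

static int gcd(int a, int b) { return b ? gcd(b, a % b) : a; }

int main(void) {
    int modes[][2] = {
        { 640, 480 }, { 800, 600 }, { 2048, 1536 },  /* all 4:3 */
        { 1280, 1024 },                              /* 5:4, the odd one out */
        { 1920, 1200 },                              /* 8:5, sold as 16:10 */
        { 1920, 1080 },                              /* 16:9, the TV one */
    };
    int n = sizeof modes / sizeof modes[0];
    for (int i = 0; i < n; i++) {
        int w = modes[i][0], h = modes[i][1], d = gcd(w, h);
        printf("%4dx%-4d -> %d:%d\n", w, h, w / d, h / d);
    }
    return 0;
}
```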
ALL HAIL THE GREAT GOD TELEVISION
The short version is that TV manufacturers looked at all the work done by PC engineers, looked at new-fangled LCD TFT tech, and said thanks very much we’ll take it from here. At least, standards-wise.
Actually it’s a little fiddlier than that. There are tedious reasons why a cathode ray tube monitor is much, much cheaper to build in a squarish shape than a rectangular one, but this didn’t apply in the same way to LCD.
Like almost everything, liquid crystal displays got their big break on PCs. I remember reviewing a 19-inch 1600x1200 LCD in 2000. I forget who built it (probably Philips, it was a weird time) but I’ll never forget the RRP: $9,999. And ghosting? Man, that thing ghosted like Paranormal Activity XI: Amityville.
LCD manufacturing is all about making a giant thin film transistor sheet and cutting it up to give you as many panels as possible. At some point, someone figured out you could take a 19-inch 1280x1024 display and lop off the bottom to make it a 1280x720 17-inch widescreen display.
Don’t market it as a cut-down 19-inch though. Market it as an extended, wide 17-inch display! You get more panels from the TFT sheet, and you can charge more because it’s widescreen!
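The glass maths backs this up. Given a diagonal and an aspect ratio, Pythagoras gives you the panel’s width, height and area - and the 17-inch widescreen turns out to use roughly 30 per cent less of the mother-sheet than the 19-inch 5:4 panel (a sketch, using the panel sizes from above):

```c
/* Panel width, height and area from diagonal and aspect ratio.
 * height = diagonal / sqrt(aspect^2 + 1), width = height * aspect. */
#include <stdio.h>
#include <math.h>

static void panel(const char *name, double diag, double ar) {
    double h = diag / sqrt(ar * ar + 1.0);  /* Pythagoras */
    double w = h * ar;
    printf("%-22s %5.2f x %5.2f in, area %6.1f sq in\n",
           name, w, h, w * h);
}

int main(void) {
    panel("19\" 5:4 (1280x1024)", 19.0, 1280.0 / 1024.0);
    panel("17\" 16:9 (1280x720)", 17.0, 1280.0 / 720.0);
    return 0;
}
```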
This worked brilliantly right up until the release of BioShock, when PC gamers on widescreen monitors playing next to gamers on 4:3 monitors realised - to their horror - that the old-school 4:3 guy could see more of the environment! BioShock wasn’t extending the game to the left and right! It was lopping it from the top and bottom! Cue outrage!
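To put numbers on the outrage: if a game holds the horizontal field of view fixed (90 degrees here - an assumed figure, not BioShock’s actual value), the vertical field of view shrinks as the screen gets wider. A short C sketch:

```c
/* Vertical FOV from a fixed horizontal FOV and an aspect ratio:
 * vfov = 2 * atan(tan(hfov / 2) / aspect). */
#include <stdio.h>
#include <math.h>

static const double PI = 3.14159265358979323846;

static double vertical_fov(double hfov_deg, double aspect) {
    double hfov = hfov_deg * PI / 180.0;              /* to radians */
    return 2.0 * atan(tan(hfov / 2.0) / aspect) * 180.0 / PI;
}

int main(void) {
    const double hfov = 90.0;  /* assumed horizontal FOV, held fixed */
    printf("4:3   -> vertical FOV %.1f degrees\n", vertical_fov(hfov, 4.0 / 3.0));
    printf("16:10 -> vertical FOV %.1f degrees\n", vertical_fov(hfov, 16.0 / 10.0));
    printf("16:9  -> vertical FOV %.1f degrees\n", vertical_fov(hfov, 16.0 / 9.0));
    return 0;
}
```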
Me, I say at least it was still running at 16:10. Consumer electronics had yet to deliver the final insult, the unkindest cut of all.
But unkindly cut it eventually did. The first truly affordable mass-market widescreen “Full HD” gaming monitors, appearing around 2009, didn’t run at 1920x1200. A 16:10 1920x1200 display was expensive. I’m not ashamed to say I spent $1200 on a Dell once. That screen lasted more than a decade, and I think I played my role in helping push prices down.
But I do feel bad that, even as editor of this mighty magazine (well, ex-editor at the time, it was about 2005), I was unable to stop the rot, to prevent the insult that is 16:9.
See, those first full HD monitors were 1920x1080. That’s the same resolution as a [spits] television. We were robbed of 120 vertical lines of resolution by digital programming. By 7Mate.
That’s right. After nearly 30 years we’ve come full circle. From using weird RF adaptors to make our TVs work as computer monitors, we now have computer monitors that are basically fancy (and tiny) TVs.
I guess 4K makes up for it. Just not very much, is all.
[Caption: You may scoff, but the Switch has benefited from decades of PC evolution.]
[Caption: This is your video card’s great, great, great... grandparent.]
[Caption: BioShock showed gamers exactly what they were missing out on...]