Objects in Space - PC Monitor Evolution

What’s the real difference between PC gaming and every other platform? Is it control systems, graphical power, genres of play, price? No: for each of those there is an equivalent. The real difference is variable screen resolution. PC gamers have suffered through decades of shifting resolutions and standards.




At this point, it’s pretty difficult to argue that the entire gaming industry hasn’t been driven by advances in open-platform PC technology. Everything from the USB plugholes in your PlayStation 4 to the Bluetooth and low-powered GPU that make the Nintendo Switch workable at all comes from PC.

The Switch is an especially good example, because unlike the big box consoles, it has to be able to run every single game in at least two resolutions, specifically 1280x720 (in handheld mode) and 1920x1080 (in docked mode).

Dynamically switching between resolutions, while a game is running, is no trivial thing. Fortunately, Nintendo had approximately 30 years of PC experience to draw on.
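The idea can be sketched in a few lines. This is purely illustrative Python - the two resolutions are the Switch’s published output modes, but the frame-time thresholds and scale steps are my own assumptions, not anything Nintendo actually ships:

```python
# Illustrative sketch of dynamic resolution scaling - not Nintendo's code.
DOCKED = (1920, 1080)     # docked output mode
HANDHELD = (1280, 720)    # handheld output mode

def pick_render_size(docked, last_frame_ms, scale):
    """Choose an internal render size; the GPU then scales it to the output."""
    target_w, target_h = DOCKED if docked else HANDHELD
    # Crude heuristic (assumed numbers): drop the render scale when the last
    # frame blew its 60 fps budget (~16.6 ms), creep back up when well under.
    if last_frame_ms > 16.6:
        scale = max(0.7, scale - 0.05)
    elif last_frame_ms < 14.0:
        scale = min(1.0, scale + 0.05)
    return round(target_w * scale), round(target_h * scale), scale
```

A real engine does this per-axis and in hardware, but the principle - render small, scale up to whatever the display wants - is the same one PC games had been leaning on for years.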


It really is hard to overstate the difference between the development of the Apple II and the IBM PC. Here was Apple, being all precious and independent, depending entirely on the equivalent of electronic wizardry on the part of Steve Wozniak and the interpersonal relationship equivalent of being hit with a cricket bat with nails in it on the part of Steve Jobs. Meanwhile, IBM was spending roughly a brazillian dollars on cramming an entire mainframe into a box barely six or seven times the size of a modern PC.

Key to this was the IBM PC’s graphics output. The Apple II Plus (the current model when the first IBM PC came out in 1981) could do either 16-colour graphics at 40x48 (not a typo) or six-colour graphics at 280x192. They called that hi-res mode.

IBM’s new Color/Graphics Adapter destroyed this with a massive boost to 320x200. Yeah take that Apple!

See, back in the 1980s, personal computers had various graphics modes. Mostly, they split between text-only, standard detail graphics, and high detail graphics.

IBM’s CGA adaptor ran a whole bunch of confusingly different modes and did astoundingly arcane stuff to the output signal to make it work on a TV. Interestingly, the original output standard was digital - but only worked on a specialised IBM monitor. An RGBI monitor [RUNTIME ERROR - OBVIOUS JOKE MISSING LINE 84]. Remember: all this stuff cost not thousands, but tens of thousands of dollars on release.

Anyway, as with so much PC tech, the creation of the CGA adaptor created economies of scale that dropped the prices on all kinds of components, which let engineers do more, which led to a next generation of kit - in this case, the Enhanced Graphics Adapter.

The EGA was mindblasting in 1984. I mean we’re talking 640x350 with 16 colours, from a palette of 64 colours. That means it could only show 16 colours at once, but had 64 to choose from. That’s so many colours! That’s a two-layer pencil box of colours!
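The arithmetic is easy to check: two bits each of red, green and blue gives 64 palette entries, and a 4-bit pixel value picks 16 of them. A quick sketch (the 0x00/0x55/0xAA/0xFF intensity steps are the standard EGA levels):

```python
def ega_palette():
    # 2 bits per channel = 4 intensity levels, so 4**3 = 64 possible colours.
    levels = (0x00, 0x55, 0xAA, 0xFF)
    return [(r, g, b) for r in levels for g in levels for b in levels]

palette = ega_palette()
print(len(palette))  # 64 colours to choose from...
print(2 ** 4)        # ...but only 16 on screen at once (4 bits per pixel)
```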

This was also the point where third parties started to mess around with the open nature of the “IBM compatible” PC. It was also where we first saw the 4:3 aspect ratio in 640x480.

But it was in 1987 when things got really serious. You might think of “VGA” as those old blue plugholes on school computers, but it stands for Video Graphics Array (an “array” this time rather than an “adapter” - thanks, IBM).

VGA cemented the 640x480 16-colour resolution as the standard, one that’s still used today. But it could also do 320x200 with an astonishing 256 colours. Sounds low-res? Remember: everyone was using 14 or 15-inch CRT monitors. You could see the pixels but it still looked crisp. Like Minecraft running in 4K.

Because 1987 was when the clone market for PCs really started to take off, VGA became the standard, well, standard for PCs, and 640x480 became the first widely implemented standard for “good” PC graphics.


The year 1987 was important for PC gaming in so many ways. Because it was the year when serious numbers of PC clones started to appear, it also marks the point where IBM threw up its hands and said “Good luck guys!” to the industry.

From 1987, the future of PC gaming would no longer be defined exclusively by IBM. Sure, Intel would shoulder much of the burden, but other consortia and standards organisations began to spring up - particularly VESA, the Video Electronics Standards Association.

As we began to rush headlong into the nutso 90s, having a “VESA compatible” graphics chip made configuring games like Wing Commander and the various King’s Quests so much easier.

VESA championed a standard called, simply, Super VGA. Super VGA wasn’t a specific kind of adapter, but came eventually to mean anything that built off the VGA standard.

For gamers, SVGA meant a resolution of 800x600, 256 colours, and (at first) framerates solidly in the teens.

What you’ll notice about 640x480 and 800x600 is that both these resolutions have an aspect ratio of 4:3. Nothing about SVGA was ever set in stone, but monitor manufacturers needed to be confident their products would work with any of the increasing number of PC brands, and those PC makers could no longer rely on customers buying a (usually expensive) brand-matched monitor.
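You can verify the aspect ratios yourself: divide width and height by their greatest common divisor. A throwaway check in Python:

```python
from math import gcd

def aspect(w, h):
    # Reduce a resolution to its simplest width:height ratio.
    g = gcd(w, h)
    return f"{w // g}:{h // g}"

for w, h in [(640, 480), (800, 600), (1280, 1024), (1920, 1080)]:
    print(f"{w}x{h} -> {aspect(w, h)}")
```

(640x480 and 800x600 both reduce to 4:3; 1280x1024, which turns up later in this story, is actually the slightly squarer 5:4.)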

PC didn’t have Apple’s luxury of controlling all the hardware, top to bottom, so Super VGA was a complicated standard that could, ultimately, support resolutions all the way up to 2560x2048. Which you would only ever see on a gigantic 21-inch CRT monitor most likely built by Sony and owned by your mate’s dad who was into CAD in some way.

Of course the maths-wonks among you will realise that 2560x2048 is still squarish - 5:4, in fact, even squarer than 4:3. But the monitor you play your games on today isn’t anything like square. So what’s up with that?


The short version is that TV manufacturers looked at all the work done by PC engineers, looked at new-fangled LCD TFT tech, and said thanks very much we’ll take it from here. At least, standards-wise.

Actually it’s a little fiddlier than that. There are tedious reasons why a cathode ray tube monitor is much, much cheaper to build in a squarish shape than a rectangular shape, but this didn’t apply, in the same way, to LCD.

Like almost everything, liquid crystal displays got their big break on PCs. I remember reviewing a 19-inch 1600x1200 LCD in 2000. I forget who built it (probably Philips, it was a weird time) but I’ll never forget the RRP: $9,999. And ghosting? Man, that thing ghosted like Paranormal Activity XI: Amityville.

LCD manufacturing is all about making a giant thin film transistor sheet and cutting it up to give you as many panels as possible. At some point, someone figured out you could take a 19-inch 1280x1024 display and lop off the bottom to make it a 1280x720 17-inch widescreen display.

Don’t market it as a cut-down 19-inch though. Market it as an extended, wide 17-inch display! You get more panels from the TFT sheet, and you can charge more because it’s widescreen!
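The numbers check out, too. Assuming the same pixel pitch on both panels (same TFT sheet, after all), you can derive the diagonal of the cut-down panel:

```python
from math import hypot

def diagonal_inches(w_px, h_px, pitch_in):
    # Physical diagonal = diagonal in pixels * pixel pitch (inches per pixel).
    return hypot(w_px, h_px) * pitch_in

# Derive the pixel pitch of the 19-inch 1280x1024 panel...
pitch = 19 / hypot(1280, 1024)
# ...then measure a 1280x720 slice cut from the same glass.
print(round(diagonal_inches(1280, 720, pitch), 1))  # -> 17.0
```

Lop 304 rows off a 19-inch 5:4 panel and, sure enough, you’re left with a 17-inch 16:9 one.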

This worked brilliantly right up until the release of BioShock, when PC gamers on widescreen monitors playing next to gamers on 4:3 monitors realised - to their horror - that the old-school 4:3 guy could see more of the environment! BioShock wasn’t extending the game to the left and right! It was lopping it from the top and bottom! Cue outrage!

Me, I say at least it was still running at 16:10. Consumer electronics had yet to deliver the final insult, the unkindest cut of all.

But unkindly cut it eventually did. The first truly affordable mass-market widescreen “Full HD” gaming monitors, appearing around 2009, didn’t run at 1920x1200. A 16:10 1920x1200 display was expensive. I’m not ashamed to say I spent $1200 on a Dell once. That screen lasted more than a decade, and I think I played my role in helping push prices down.

But I do feel bad that, even as editor of this mighty magazine (well, ex-editor at the time, it was about 2005), I was unable to stop the rot, to prevent the insult that is 16:9.

See, those first full HD monitors were 1920x1080. That’s the same resolution as a [spits] television. We were robbed of 120 vertical lines of resolution by digital programming. By 7Mate.

That’s right. After nearly 30 years we’ve come full circle. From using weird RF adaptors to make our TVs work as computer monitors, we now have computer monitors that are basically fancy (and tiny) TVs.

I guess 4K makes up for it. Just not very much, is all.

You may scoff, but the Switch has benefitted from decades of PC evolution.

This is your video card’s great, great, great... grandparent. BioShock showed gamers exactly what they were missing out on ...
