Tech Advisor

G-Sync vs FreeSync

Imagine games without stuttering or tearing. Two technologies promise that.

- Jason Evangelho reports

Variable refresh rate monitor. That jumble of words isn’t rocketing to the top of any ‘sexiest tech phrases’ list anytime soon. Nevertheless, it’s game-changing technology that’s just beginning to seep into mainstream awareness and adoption. You may know this technology by another, slightly more memorable pair of names from the two companies driving your PC gaming experience: Nvidia’s G-Sync and AMD’s FreeSync.

Dozens of G-Sync and FreeSync monitors are now available to satisfy a broad range of cravings. You’ll find models priced from £125 to north of £1,000, encompassing 1080p, 4K, and even gorgeous curved 1440p UltraWide displays.

So what’s the big deal? Do you need G-Sync or FreeSync in your life? Does it cost more? Are there any deal-breaking drawbacks? Is this tech restricted to desktop use? Is your current video card compatible? Sit back, grab a cup of tea and let’s tackle these pressing questions.

What’s so special about G-Sync and FreeSync?

Ever since we began manipulating onscreen gaming graphics with a keyboard and mouse, the two crucial pieces of hardware in that equation have been butting heads. Your video card is impatient to push image frames to your monitor, but if your monitor’s refresh rate is fixed at something like 60Hz, that beautiful frame of animation arrives before the monitor is ready for it. You only see part of what’s happening: a portion of the current frame, and a portion of the next. It looks as if the picture were trying to split itself in two and take off in different directions, and the effect only worsens the more your game’s framerate fluctuates.

Another name for this is screen tearing, an ugly artifact that PC gamers have grudgingly accepted as reality. But it’s more than an annoyance – it’s the difference between in-game life and death. Say you’re playing Battlefield 4 and a sniper camping on a mountain peak takes aim at you. The glint of his scope against the sunlight would give him away, except you didn’t see it because it took place on that fragment of a frame your monitor rejected. Yes, it’s an extreme case, but it underscores a very real problem.

The existing workaround is the V-Sync setting on your graphics card. Sadly, in solving one problem this introduces another – a scenario where your monitor is calling the shots. Now when your GPU is ready to deliver that frame, the monitor says: “Wait a few more milliseconds! This silly gamer doesn’t want screen tearing”. With V-Sync on, this manifests itself as ‘stutter’, or seeing the animation last a touch longer than it’s supposed to. It can be a little jarring, and make the game you’re playing feel sluggish.

Ready for yet another symptom of V-Sync? The dreaded ‘lag’. Let’s go back to Battlefield 4 and imagine you just pulled the trigger during a gunfight. Guess what happens if you do it right before the monitor ‘accepts’ the corresponding onscreen visual? That precious bullet doesn’t fire at the exact millisecond you need it to.

G-Sync and FreeSync elegantly eradicate these problems by giving your video card complete control over the display. If your game is bouncing between 40 and 75 frames per second (fps), for example, then your monitor is going to follow suit, its refresh rate constantly changing to keep pace with the video card. Screen tearing, stutter, and input lag all go away.
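
To see the arithmetic behind that, here’s a toy Python model – our own sketch, not anything from Nvidia or AMD – of a GPU finishing frames at fluctuating intervals. With V-Sync on a fixed 60Hz screen, each finished frame has to wait for the next refresh boundary; that variable wait is the stutter and lag described above. A variable refresh display would simply show each frame the instant it’s ready.

```python
# A toy timing model -- an illustration only, not either vendor's algorithm.
# It measures how long a finished frame sits idle under V-Sync on a fixed
# 60Hz monitor; a variable refresh display would scan it out immediately.
import math

REFRESH_MS = 1000 / 60                      # fixed monitor: one refresh every ~16.7ms
render_times_ms = [13, 18, 25, 14, 22, 16]  # fluctuating GPU frame times (40-75fps)

t = 0.0
print(f"{'frame':>5} {'ready at':>10} {'v-sync shows at':>16} {'extra wait':>11}")
for i, rt in enumerate(render_times_ms):
    t += rt                                  # the moment the GPU finishes this frame
    # V-Sync: the frame can only appear on the next fixed refresh boundary
    vsync_show = math.ceil(t / REFRESH_MS) * REFRESH_MS
    # G-Sync/FreeSync: the monitor refreshes the instant the frame is ready,
    # so the wait is zero and every frame is drawn whole
    print(f"{i:>5} {t:>8.1f}ms {vsync_show:>14.1f}ms {vsync_show - t:>9.1f}ms")
```

Run it and the V-Sync wait bounces between a couple of milliseconds and most of a full 16.7ms refresh from one frame to the next – exactly the inconsistency your eyes register as stutter, and your trigger finger registers as lag.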

What are the differences between G-Sync and FreeSync?

Nvidia’s G-Sync deserves credit for being the first solution on the scene. Aside from the bragging rights, however, a couple of key differences distinguish Nvidia’s variable refresh rate technology from AMD’s.

Nvidia invented G-Sync to address both sides of the problem – the GPU and the monitor. Every monitor box emblazoned with a G-Sync logo packs a proprietary module. Nvidia understandably won’t divulge too many details, but the module lets Nvidia fine-tune the experience to each monitor’s characteristics, such as maximum refresh rate, panel type (IPS or TN) and voltage. Even when your frame rate gets super low or super high, G-Sync can keep your game looking smooth.

Nvidia points to ghosting as a key advantage G-Sync has over AMD’s FreeSync. The G-Sync module prevents ghosting by customising the way it operates for each and every monitor. With AMD, these adjustments are made within the Radeon driver itself, while the display’s firmware handles other parts of the mix. One of Nvidia’s loudest arguments is that AMD may or may not keep pace with those changes at the driver level. With Nvidia’s G-Sync module, because each monitor is physically tweaked and tuned, keeping up with all the panel variations is part of the job.

We have seen ghosting in AMD FreeSync panels such as the Acer XG270HU, but never in a G-Sync monitor, though the ghosting issues in some earlier FreeSync displays have since been corrected via monitor firmware updates. PC Perspective produced a video comparing the ghosting effects in early FreeSync monitors against the Asus ROG Swift, a G-Sync monitor.

AMD based FreeSync on a royalty-free, industry-standard spec known as DisplayPort Adaptive-Sync. The indisputable fact here is that monitor manufacturers don’t need to implement a proprietary hardware module, so their costs are lower. Ideally, those savings get passed on to you, the consumer. And in fact, across the board, FreeSync monitors are cheaper.

Let’s take a quick look at two gorgeous monitors from Acer. Both are sexy, curved UltraWide displays with an IPS panel at 3440x1440 resolution. Both have a 4ms response time, and both include HDMI and DisplayPort inputs. They’re nearly identical, except that the XR341CK supports FreeSync and costs £990, while the G-Sync version – the Predator X34 – costs £110 more. Granted, it rocks a slightly higher 100Hz refresh rate, but the G-Sync markup is obvious. That’s an expensive example, but it doesn’t hurt AMD’s argument that FreeSync is the more affordable solution.

Is the card in your PC compatible?

As you can imagine given the frequently bitter rivalry between AMD and Nvidia, your GeForce GTX video card won’t support FreeSync, and your Radeon video card won’t give you that buttery smooth experience on a G-Sync monitor. Yes, Nvidia has the option of adopting the FreeSync/Adaptive-Sync standard – as Intel plans to do one day – but if you’d invested millions into developing a technology to exclusively benefit your users, would you?

Worries about brand lock-in aside, let’s say you want to take the plunge. Is your beloved graphics card compatible? On the Nvidia side the answer is simple: every GeForce GTX card from the 650 Ti Boost onwards will do the trick, including every 700-series and 900-series desktop graphics card. With AMD the support is a bit scattershot, because some of the company’s offerings are based on older GPUs. For example, the Radeon 360 is FreeSync-compatible but the Radeon 370 isn’t. The Radeon 260, 260X, 285, 290, and 290X are ready for FreeSync, but the 270 and 270X aren’t. And so it goes.
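
As a quick illustration of just how scattershot that support is, here’s a throwaway Python lookup built purely from the examples above – the lists mirror this article’s examples and are nowhere near exhaustive, and the function name is our own invention.

```python
# Illustrative lookup built only from the cards named in this article --
# not an official or complete FreeSync compatibility table.
FREESYNC_READY = {"Radeon 260", "Radeon 260X", "Radeon 285",
                  "Radeon 290", "Radeon 290X", "Radeon 360"}
OLDER_GPU_NO_FREESYNC = {"Radeon 270", "Radeon 270X", "Radeon 370"}

def freesync_status(card: str) -> str:
    """Report what this article says about a card's FreeSync support."""
    if card in FREESYNC_READY:
        return f"{card}: ready for FreeSync"
    if card in OLDER_GPU_NO_FREESYNC:
        return f"{card}: based on an older GPU, no FreeSync"
    return f"{card}: not covered here - check AMD's current support list"

for card in ("Radeon 290X", "Radeon 270", "Radeon 380"):
    print(freesync_status(card))
```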

Here’s something cool, though: a strong handful of AMD’s affordable APUs (all-in-one CPU-and-GPU chips) also support FreeSync, which opens up the possibility of building a cheap 1080p gaming box and still getting a smooth gaming experience courtesy of FreeSync.

Wait, can you get this technology on a notebook?

Yes. A handful of G-Sync and FreeSync powered notebooks are on the market right now from Asus and MSI, with more on the way from popular manufacturers such as Lenovo and Gigabyte. Unlike desktop monitors, notebook displays don’t require the proprietary G-Sync module, but to ensure quality, Nvidia is pretty stingy with its approval process. For example, all of the current G-Sync-enabled laptop displays top out at 75Hz rather than the standard 60Hz. As for supported mobile GPUs, right now it’s just the GeForce GTX 950M, 960M, 965M, 970M and 980M, plus all GeForce GTX 10-series laptops. Nvidia is dedicated to the G-Sync cause, so expect to see a proliferation of G-Sync gaming notebooks in the near future.

Are there any deal-breakers to choosing either one?

In our experience, no. Both will greatly improve your gameplay.

There is, however, one minor niggle: as things stand right now, G-Sync works only with DisplayPort inputs, meaning the vast majority of TVs are locked out of the equation. That’s bad news for anyone with an HTPC or perhaps an upcoming Steam Machine hooked up to a living-room TV over HDMI. AMD, meanwhile, now has FreeSync working over HDMI on certain GPUs, while Nvidia currently has no announced plans to incorporate HDMI support into G-Sync.

Closing thoughts

Both FreeSync and G-Sync work exceptionally well at combating the decades-old visual problems plaguing PC gaming. Neither has an exclusive feature compelling enough to warrant switching camps, however. Nvidia has a slight advantage at the very low and very high ends of the frame rate spectrum, and G-Sync does a better job with ghosting, but these are what we’d call edge cases that won’t affect the vast majority of gamers. On the other hand, AMD has a price advantage, with comparable FreeSync monitors clocking in at an average of £100 to £150 less.

Whichever you choose, we enthusiastically encourage you to jump on the variable refresh rate train, as long as you’re okay with the fact that you’re locking yourself into purchasing graphics cards from the same brand for the life of the display. If you’re rocking Radeon, make FreeSync your next monitor upgrade. If you’re gaming with GeForce, take a good hard look at the crop of G-Sync options. It can’t be stressed enough how dramatically they improve a game’s immersion, and how effectively they eliminate nasty screen tearing, stutter, and input lag, all without introducing new problems into the mix.

We’ll go on record saying that if given the choice between a non-G-Sync/FreeSync 4K monitor and a smaller, G-Sync/FreeSync-enabled 1440p monitor, we’d choose the latter every single time. It’s just that awesome. Find a way to witness it for yourself and you’ll be convinced.

Another example of screen tearing

A G-Sync monitor

Most – but not all – of the AMD and Nvidia graphics cards released in the past two or three years should be compatible with FreeSync or G-Sync

Acer’s curved, ultra-wide 3440x1440 XR341CKA G-Sync monitor
