Adaptive-Sync On NVIDIA GeForce First Look - Good Start, Not Quite FreeSync

by Tim Harmer · 15.01.2019 22:56:53



NVIDIA are no strangers to the VESA Adaptive Sync standard. Before launching G-SYNC in 2014 they were involved in developing the standard alongside other industry luminaries, including AMD. It was only when it became clear that the final VESA standard would not fulfil their needs that they threw their considerable weight behind a proprietary implementation of Variable Refresh Rate technology. AMD meanwhile pursued Adaptive Sync as a solution, with FreeSync being the result.

It's been a long five years since the first G-SYNC monitor was unveiled. Both the number of models supporting G-SYNC and the technology's complexity have grown considerably, and the past year saw the introduction of 4K 144Hz HDR 'G-SYNC Ultimate' panels. But even twenty premium-grade gaming monitors pale alongside the more than five hundred Adaptive Sync monitors that have made it to the consumer mass market.

This past year even TV and games console manufacturers announced support for Adaptive Sync, and chipmaking giant Intel is likely to roll compliance into their GPUs in the not too distant future. Nonetheless, even with pressure building, few would have bet on NVIDIA CEO Jensen Huang announcing even limited support for the technology at CES this year.

Today NVIDIA rolled out the GeForce Game Ready 417.71 WHQL drivers, and in so doing unlocked provisional Adaptive Sync support on suitable monitors. Now that the drivers are available we've had a very brief chance to try out the feature they're calling 'G-SYNC Compatibility', and wanted to relay our initial thoughts.

Here's what you'll need to get started (summarised in the short sketch after this list):

1. A PC with an NVIDIA GPU based on the Pascal, Turing or Volta architecture. This encompasses almost all of the GeForce GTX 10-series and GeForce RTX 20-series of cards, but does leave out two of the most popular cards currently in the hands of PC gamers: the GTX 960 and 970.

i) The graphics card must have a DisplayPort output. That covers the vast majority of NVIDIA graphics cards, but it's worth double-checking before you begin.

2. A monitor that supports the VESA Adaptive Sync variable refresh rate standard over DisplayPort.

i) Ideally, the monitor will be one of the twelve listed below which are officially 'G-SYNC Compatible'.

ii) Although some monitors support FreeSync over HDMI, similar modes are not yet available to NVIDIA GPUs.
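
As a quick sanity check, here's a minimal Python sketch of that checklist. The function and the architecture list are purely our own illustration of the requirements above, not an official NVIDIA tool:

    # Rough eligibility check for NVIDIA's 'G-SYNC Compatibility' mode,
    # based on the requirements listed above. Purely illustrative.
    SUPPORTED_ARCHITECTURES = {"Pascal", "Volta", "Turing"}

    def gsync_compat_ready(gpu_architecture: str,
                           gpu_has_displayport: bool,
                           monitor_adaptive_sync_over_dp: bool) -> bool:
        return (gpu_architecture in SUPPORTED_ARCHITECTURES
                and gpu_has_displayport
                and monitor_adaptive_sync_over_dp)

    # A GTX 1060 (Pascal) driving an Adaptive Sync monitor over DisplayPort:
    print(gsync_compat_ready("Pascal", True, True))    # True
    # A GTX 970 (Maxwell) fails on the architecture requirement:
    print(gsync_compat_ready("Maxwell", True, True))   # False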


G-SYNC Compatible monitors have been tested by NVIDIA and deemed to offer a good enough gaming experience for them to earn the 'G-SYNC' moniker, even though they lack the integrated G-SYNC module. Of over 400 Adaptive Sync models tested only 12 satisfied this requirement, almost all of them 'gaming grade' monitors that have also met AMD's looser FreeSync standards.

Setting up.

It will come as little surprise that setting up G-SYNC compatibility on your Adaptive Sync DisplayPort monitor is nigh on trivial. To start, simply download and install the GeForce Game Ready 417.71 WHQL drivers either directly or through GeForce Experience.


Once installed, if you have a G-SYNC Compatible display connected, a Windows notification will pop up.


Navigating to the NVIDIA Control Panel will reveal the following bare-bones options page.

Anyone with an Adaptive Sync monitor connected via DisplayPort will be able to toggle the setting to enable support. Those with G-SYNC Compatible monitors will have it enabled by default, and also have the option of using display-specific presets that NVIDIA have determined to be ideal.



Finally, double-check that Windows Display Settings properly identifies the monitor's maximum refresh rate.
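
If you'd rather verify programmatically, the mode Windows is currently driving can be read through the Win32 EnumDisplaySettingsW API. Below is a minimal Python/ctypes sketch; the DEVMODEW layout is deliberately truncated at dmDisplayFrequency, the last field we need here:

    # Hypothetical helper (not bundled with the drivers): read the display
    # mode Windows is currently driving via Win32 EnumDisplaySettingsW.
    import ctypes
    from ctypes import wintypes

    ENUM_CURRENT_SETTINGS = -1  # per wingdi.h

    class DEVMODEW(ctypes.Structure):
        # Truncated DEVMODEW: fields up to and including dmDisplayFrequency.
        _fields_ = [
            ("dmDeviceName",         wintypes.WCHAR * 32),
            ("dmSpecVersion",        wintypes.WORD),
            ("dmDriverVersion",      wintypes.WORD),
            ("dmSize",               wintypes.WORD),
            ("dmDriverExtra",        wintypes.WORD),
            ("dmFields",             wintypes.DWORD),
            ("dmPositionX",          ctypes.c_long),  # display union, flattened
            ("dmPositionY",          ctypes.c_long),
            ("dmDisplayOrientation", wintypes.DWORD),
            ("dmDisplayFixedOutput", wintypes.DWORD),
            ("dmColor",              ctypes.c_short),
            ("dmDuplex",             ctypes.c_short),
            ("dmYResolution",        ctypes.c_short),
            ("dmTTOption",           ctypes.c_short),
            ("dmCollate",            ctypes.c_short),
            ("dmFormName",           wintypes.WCHAR * 32),
            ("dmLogPixels",          wintypes.WORD),
            ("dmBitsPerPel",         wintypes.DWORD),
            ("dmPelsWidth",          wintypes.DWORD),
            ("dmPelsHeight",         wintypes.DWORD),
            ("dmDisplayFlags",       wintypes.DWORD),
            ("dmDisplayFrequency",   wintypes.DWORD),
        ]

    dm = DEVMODEW()
    dm.dmSize = ctypes.sizeof(DEVMODEW)
    # None selects the primary display device.
    if ctypes.windll.user32.EnumDisplaySettingsW(None, ENUM_CURRENT_SETTINGS,
                                                 ctypes.byref(dm)):
        print(f"{dm.dmPelsWidth}x{dm.dmPelsHeight} @ {dm.dmDisplayFrequency}Hz")

On the XG270HU you'd expect this to report the panel running at 144Hz; if it shows 60Hz, set the correct refresh rate in Windows Display Settings before enabling the feature.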

A quick hands-on.



Unfortunately I only had one Adaptive Sync monitor to hand for a brief hands-on this morning: Acer's XG270HU. You may remember it from the initial round of FreeSync launch models in March of 2015, and as luck would have it NVIDIA class it as one of their twelve G-SYNC Compatible monitors. As such, it was immediately recognised and configured to NVIDIA's presets without hassle.

The XG270HU is a 144Hz 1440p TN panel with an exceptionally wide Adaptive-Sync refresh rate range: 30Hz to 144Hz. Although not as wide as that of true G-SYNC panels, many of which advertise a 1Hz minimum, it is wider than the majority of its competitors despite its early vintage. It does not, so far as we're aware, support Adaptive-Sync modes over either HDMI or DVI; however as NVIDIA currently restrict functionality to DisplayPort only, that's hardly a hardship.

In terms of NVIDIA GPU hardware I am limited to a GeForce GTX 1060 6GB, GTX 970 and GTX 660. Thankfully Pascal is supported, but the older Maxwell and Kepler cards aren't up to it just yet.

With limited time, I wanted to experience gaming at two ends of the spectrum. Obviously we're a little horsepower-hampered here for a 1440p display, so we travelled back to Borderlands 2 to meet the upper echelons of 1440p@144fps gaming, and to The Witcher 3 for something in the vicinity of 40fps.

Initial Impressions:

Despite NVIDIA's rather bare-bones implementation, playing with G-SYNC Compatibility mode enabled generally felt as natural and fluid as it does with an AMD GPU utilising their FreeSync implementation, so long as the frame rate stayed within the XG270HU's Variable Refresh Rate window.

NVIDIA are worried about specific issues with some panels, including blanking (black frames when the monitor and GPU de-sync because the frame rate falls outside the Adaptive Sync range) and pulsing (the backlight refreshing out of sync with frames), but neither was perceptible in this case. Clearly NVIDIA's testing team were on the ball in this instance.

However it's not all sweetness and light. I observed pronounced stuttering whenever frame rates dropped below the monitor's minimum. This was exceptionally clear in The Witcher 3, which incorporates many in-game cutscenes that appear to be limited to 28fps (according to FRAPS readings) at quality settings suitable for nominally 45fps play. The difference between ~45fps gameplay and 28fps cutscenes was like night and day.

This highlights NVIDIA's apparent lack of Low Framerate Compensation (LFC) under G-SYNC Compatibility mode, a feature that AMD included early in FreeSync's lifespan. When the game renders below the monitor's minimum refresh rate, AMD's LFC repeats each frame, preserving the cadence of the low rendered frame rate while driving the panel at a multiple of that rate, back inside the supported window. It's not perfect, but it offers a markedly better experience than effectively enforcing V-SYNC at 30Hz when below the minimum.
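
To make that concrete, here's a minimal Python sketch of the frame-repetition idea; the function and its numbers are our own illustration of the technique, not AMD's or NVIDIA's actual driver logic:

    def lfc_refresh_rate(fps: float, vrr_min: float, vrr_max: float) -> float:
        """Pick the panel refresh rate for a given rendered frame rate,
        repeating frames when fps falls below the VRR window minimum."""
        if fps >= vrr_min:
            # Inside the window: drive the panel at the render rate,
            # clamped to the panel's maximum.
            return min(fps, vrr_max)
        # Below the window: scan out each frame enough times that the
        # effective refresh rate lands back inside the window.
        multiplier = 2
        while fps * multiplier < vrr_min:
            multiplier += 1
        return min(fps * multiplier, vrr_max)

    # A 28fps cutscene on the XG270HU's 30-144Hz window would be shown at
    # 56Hz (each frame scanned out twice) rather than a juddery 30Hz V-SYNC:
    print(lfc_refresh_rate(28, 30, 144))  # 56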

True G-SYNC models, with a minimum refresh rate of 1Hz, do not need this feature, but it's something that NVIDIA could do with implementing post-haste. It would only be suitable on monitors whose maximum refresh rate is a little more than twice their minimum, but that accounts for all twelve of the G-SYNC Compatible monitors.
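
That rule of thumb is simple to express; the 2× threshold below is assumed from how frame doubling works, rather than from anything NVIDIA have published:

    def could_support_lfc(vrr_min: float, vrr_max: float) -> bool:
        # Frame doubling only helps if twice the lowest below-window frame
        # rate still fits inside the window, hence max > 2 x min.
        return vrr_max > 2 * vrr_min

    print(could_support_lfc(30, 144))  # XG270HU's 30-144Hz window -> True
    print(could_support_lfc(48, 75))   # a narrow 48-75Hz window   -> False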

As for monitors with narrow Adaptive Sync ranges or other traits which NVIDIA find unacceptable, they are beyond our ability to test directly. However it seems reasonable to infer that unless you can effectively cap output frame rates at the top end, and never drop below the lower extent of the range, the experience you'll have will be broadly unacceptable too. Gaming monitors with ranges spanning from at most 44Hz up to at least 100Hz should be okay when paired with a suitable GPU. And that's setting aside the other issues identified by NVIDIA.

Nonetheless, whatever your current model of Adaptive Sync monitor, you are now able to test these features for yourself rather than being locked out. Let's not look a gift horse in the mouth: this is easily a net gain for consumers over the prior status quo.

In summary: a good first crack of the whip at a feature that, in all likelihood, not many at NVIDIA really wanted to support. Improvements are still to be made before it can claim to be an implementation equal in scope and outcome to FreeSync, but there's now a clear pathway to success. Extending compatibility back to older generation architectures would be an excellent follow-up gesture by the Green Team. By the same token, it's no threat to the latest generation of true G-SYNC panels.

Wait for a little more independent testing to shake out any critical flaws, and then give G-SYNC Compatibility a try for yourself. It could be enlightening, especially if you have a supported gaming monitor.

Current G-SYNC Compatible Monitors & Specs




