G-Sync Compatible Explained

G-Sync Compatible: What It Is and How Nvidia Is Doing It.


Jensen Huang, CEO and co-founder of Nvidia, had a lot to say at CES and in the press conference that followed. One of the most interesting topics he covered was Adaptive Sync. For those who aren't up to speed on what Adaptive Sync, G-Sync, and Freesync are, I'll break it down for you.

 

  • Adaptive Sync - A standard created by VESA that enables variable refresh rates on displays by utilizing the vblank interval present in the eDP (embedded DisplayPort) standard found in laptops. Adaptive Sync is the basis on which Freesync operates. The feature was added to the DisplayPort 1.2a standard in 2014.
  • G-Sync - A Variable Refresh Rate method that is built into monitors via a module provided by Nvidia. Similarly to Adaptive Sync, G-Sync utilizes the vblank intervals on the display to align the delivery of frames with the refresh of the monitor. G-Sync requires a stringent certification process that covers 300+ items to help ensure quality, and Nvidia is involved throughout the entire process of the monitor's design and supply. G-Sync was announced in October of 2013, with monitors coming the following year.
  • Freesync - Freesync is the standard that AMD created to supplement the Adaptive Sync standard. It covers the software side on AMD's part that allows the GPU to communicate with the monitor, as well as a very loose qualifying specification (we'll get to this). Included within the Freesync software portion are tuning profiles for monitors as well as LFC (Low Framerate Compensation). Freesync launched in March of 2015.

As you can imagine, Nvidia having their own standard (which came first) and AMD being the only adopter of the VESA standard has been a major dividing point in the GPU community. Since its inception (and until tomorrow), having an Nvidia GPU meant you had to have a G-Sync monitor, and an AMD GPU a Freesync one, if you wanted VRR technology. This becomes a problem when considering upgrades, as your monitor is now tied to a specific GPU brand.

When Adaptive Sync was announced, everyone wondered when and if Nvidia would adopt the standard, because let's face it, it IS a standard. Product release after product release we were left wanting, as Nvidia seemed to have no interest in opening up their ecosystem. That all changed at CES 2019.

Starting on January 15th, Nvidia will support Adaptive Sync (branded Freesync at this time) going forward. But it's not all rainbows and butterflies; there is a catch.

Nvidia has a very strict certification process for their G-Sync monitors. There are over 300 different measures in their testing process that have to be met before a monitor can be certified as G-Sync. I asked for a list of those items, but sadly they aren't open to sharing them. This differs greatly from the AMD side of the house, where Freesync has almost no real qualification standards outside of being Adaptive Sync compatible. This has been a major drawback of the Freesync brand compared to G-Sync, as you never knew without explicit research what quality of monitor and VRR you were going to get.

Because Nvidia doesn't have direct control over the manufacturing of Adaptive Sync monitors, they have come up with their own certification process for G-Sync Compatible. The requirements are as follows:

  1. Sufficient VRR Range - Not all Adaptive Sync monitors have a very effective VRR range, and this was abundantly true with early monitors, some of which had only a 16 fps range, which makes the feature very unreliable. Nvidia requires a range that is greater than or equal to 2.4:1 (e.g. 60 - 144 Hz, 30 - 75 Hz). This serves two purposes: to allow for reliable driver-side frame repetition in difficult circumstances (like very high or very low jittery frame times) and to increase the chances that your game's performance is within the supported range. This range enables Dynamic Frame Repetition (or "LFC" in Freesync terms); see the sketch just after this list.
  2. VRR is enabled in the OSD by default - This is one of those 'save the consumer from themselves' kinds of things. On many Adaptive Sync capable monitors, this feature is disabled by default and sometimes buried in a heap of menus. In order to pass this test, the panel must enable VRR when a compatible GPU is plugged in. This prevents the user from buying a VRR capable monitor and never enabling the feature because they don't realize it has to be turned on to work.
  3. No bothersome flickering, overdrive artifacts, luminance/CCT drift, or aberrant behaviors while VRR is active - An issue that you may find with Freesync panels on the market today is flickering within the VRR range. This isn't a problem with the Adaptive Sync standard or the Freesync software, but rather with the scaler or panel itself and the way it behaves when VRR is in play.
  4. No bothersome flickering, overdrive artifacts, luminance/CCT drift, or aberrant behaviors at upper/lower VRR boundaries - As above, Adaptive Sync monitors must not exhibit these problems when operating above or below their VRR ranges. Again, an issue with panels and the lack of a certification process for Adaptive Sync / Freesync.
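
To make that 2.4:1 requirement concrete, here's a minimal sketch of the ratio math (my own illustration in Python, with made-up example ranges, not Nvidia's actual test procedure):

```python
# Minimal sketch of the VRR range requirement: the window's maximum refresh must
# be at least 2.4x its minimum. The example ranges below are hypothetical.

def meets_range_requirement(vrr_min_hz: float, vrr_max_hz: float, ratio: float = 2.4) -> bool:
    """True if the VRR window is at least `ratio`:1 wide."""
    return vrr_max_hz / vrr_min_hz >= ratio

for vrr_min, vrr_max in [(48, 144), (60, 144), (30, 75), (48, 75)]:
    print(f"{vrr_min}-{vrr_max} Hz -> {vrr_max / vrr_min:.2f}:1, "
          f"qualifies: {meets_range_requirement(vrr_min, vrr_max)}")
```

A 48 - 75 Hz window, for example, is only about 1.56:1, so it would fall short of the requirement even though it technically supports VRR.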

Of course, with standards like that, not all monitors could make the cut, right? Well, that's right. In fact, only 12 monitors out of the 400 tested (at the time) passed the certification. Now, I cannot stress this enough: even if these monitors didn't pass certification, THEY WILL STILL WORK on your Nvidia GPU (at least in most cases, though I haven't heard of a case where they don't work as well as they did on AMD). This may be contrary to what you hear or read elsewhere, but this is the truth of it (even if Jensen says Freesync doesn't work). If you have a Freesync monitor that didn't make the list and you want to continue to use it, you simply need to go into the Nvidia Control Panel and enable it manually.

What does this mean for the PC builder community? Well, essentially, if you have a Freesync display, you can now purchase an Nvidia GPU on your next upgrade and keep your adaptive sync capabilities. On the flip side, Nvidia did not open up their own ecosystem, so AMD GPUs will still not work with G-Sync monitors. While this may appear to be a nice move by Nvidia, do not be mistaken: they are simply increasing the number of people they can market to and putting their products back on the table for those who opted for Freesync instead.

Now, with anything related to Nvidia / Intel / AMD lately, there is always controversy, some underlying scheme being brewed, and conspiracies that pop up all over reddit, wccftech and various tech tubers (we love you guys though). So, in an effort to get some clarity, I reached out to Nvidia to get a glimpse of what their plans were and how this all worked. Here are some points that I got (if you have any other questions, let me know and I'll ask; they are very responsive).

G-Sync Compatible Branding - Some have suspected Nvidia of wanting to strip AMD of their Freesync brand and start having monitors called Adaptive Sync instead of Freesync. In addition to that, they suspect that Nvidia is charging certification fees and branding costs for slapping 'G-Sync Compatible' on these monitors. As for the first part, the monitors are at their base Adaptive Sync monitors; that is the standard. Freesync is AMD's standard/software for Adaptive Sync displays. Of course Nvidia isn't going to call them by AMD's standard, but instead by the base standard. Welcome to marketing. Secondly, and the most important part for me: there is no marketing campaign for G-Sync Compatible. Nvidia so far has purchased all tested monitors out of pocket at retail (no cherry-picked monitors from manufacturers) and tested them on their own. They are continuing to do this until all monitors are tested. Also, there is no logo campaign at this time, nor is there one planned. Going forward, this may very well change, but I don't believe (personal opinion) that Nvidia is going to have the sway to swing an already-established market without that branding. It doesn't make sense to me for a manufacturer to do that. After Nvidia is done certifying all existing panels, there may be a fee to test new panels, but I find this fair. Again, a personal opinion, but I have found the Freesync 'standard' to be very lacking, and customers may not be buying what they think they are buying with cheaper panels. They should be just 'Adaptive Sync' panels and left at that. But again, that's just my opinion.

Not all Panels Will Work - No, that's incorrect; all Adaptive Sync panels will 'work' as they are built. From what we are told, there is no difference in behavior between different manufacturers' GPUs. Being G-Sync Compatible simply means that the standards set out above have been met and the monitor will work by default when you plug it into your computer.

No "LFC" or Dynamic Frame Repetition - Again, this is incorrect. One of  the primary reasons behind the >= 2.4:1 VRR window is to make sure that the driver and monitor can do repetition at a scale that works with the range of the VRR, duplicating frames in the window of vblanks to get a stutter free experience.

No VRR over HDMI - This one actually is true. Currently Nvidia is focusing on getting current monitors tested and certified and has no immediate plans to get VRR over HDMI working. This could change in the future, or it may simply not work with Nvidia's HDMI implementation (though I have no evidence of this at all, just a possibility). We're going to have to wait and see where this goes.

Pascal and Turing Support - Pretty self-explanatory: Adaptive Sync support covers Pascal and Turing GPUs. There are no plans currently to reach back to Maxwell, but that could change in the future.

In addition to all of this, Nvidia has announced the G-Sync Ultimate brand. That wasn't the point of this article, but I will mention it here for the sake of completeness. On top of the standard G-Sync certification, Nvidia's G-Sync Ultimate adds the requirement for 1000-nit HDR, ensuring a high-end HDR experience.

After all is said and done, G-Sync vs Freesync ultimately comes down to a difference in quality control guarantees. This isn't to say that all G-Sync panels are perfect; there are panels out there that exhibit ghosting, for instance (a drawback of the panel type, not necessarily a sign of a poor panel). With G-Sync, you are getting monitors that have been checked and certified as top quality for what they are (TN/IPS/VA all have their advantages and weaknesses), whereas with Freesync and even Freesync 2, you're getting a guarantee that they meet a certain specification, but that doesn't necessarily mean you're getting a good monitor. This is not to say one is better than the other. You could argue one way or the other, but if you are a diligent customer who does their research beforehand, you can't go wrong with either.

That's it for this one. Hope this helped to clear things up and bring some light to some otherwise muddy waters. Until next time guys, PC Better.