NVIDIA To Officially Support VESA Adaptive Sync (FreeSync) Under “G-Sync Compatible” Branding



Again, it will not go back…
Nvidia refused to implement the DisplayPort 1.2a standard back in the 900 series, so no FreeSync over that. FreeSync over HDMI came much later, so the 900 series should not be compatible.
I will be very surprised if the 900 series gets FreeSync support but the AMD 7000 series (and rebrands) doesn't…


To my knowledge, Maxwell launched with DP 1.2, not DP 1.2a, which is required for VESA Adaptive-Sync.

It should be firmware-updateable to DP 1.3, which might make it work.

Adaptive-Sync is still an optional part of DP 1.3, so it depends on Nvidia, but it's worth a try.


As per AMD, these are the supported ones:


Tested my Acer XR342CK ultrawide, which comes with Adaptive-Sync. I downloaded the Nvidia Pendulum demo, and it flickered and looked worse than normal.


According to TechSpot, Nvidia does support LFC. So the black flickering was, in fact, just due to Nvidia purposefully demoing on a shitty monitor.

They say that if a monitor works on AMD with VRR, it will work on Nvidia too. Nvidia used their whole ass when they implemented this long-overdue feature.


So, as per the same article, Nvidia was using defective monitors.


Yeah, they used perhaps a quarter of an ass in the CES demo. It was, it seems, purposefully deceptive.


I still very much want that demo independently tested. Panel or not, I don't trust marketers at all when it comes to slandering a competitor.


Mine hasn't been working; I can fiddle with it again later.


Nvidia would never … oh … right.


Nvidia's marketing is working. The story is not "Nvidia could not figure out how to use FreeSync, an open standard"; instead it is "Nvidia is saving the world".


As we all know, they invented variable refresh rate…
They aren't even trying to hide it… It's blatantly wrong…


That marketing would be working if people were parroting it, which they aren't. Every news article says the same thing: "Nvidia finally breaks down and supports FreeSync, like everybody knew they eventually would."


Everyone keeps portraying the “G-Sync compatible” BRAND as some formal certificate when it is just marketing.


It isn't just marketing; Nvidia is actively certifying monitors. It looks like they want monitors with wide VRR ranges and high refresh rates so that LFC works. If your monitor caps at 60Hz, the VRR range starts at 48Hz, and a game is running at 43fps, the monitor would need to run at 86Hz for LFC to work. It can't, so you get either tearing or stuttering, depending on whether vsync is on.
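To make that arithmetic concrete: LFC is only possible when the top of the VRR window is at least double the bottom, so every frame rate below the window has some multiple that lands inside it. A minimal sketch in Python (helper names are mine, not any vendor API):

```python
def lfc_possible(vrr_min_hz, vrr_max_hz):
    # LFC needs headroom to repeat frames: the max refresh must be at
    # least twice the min, so every fps below vrr_min_hz has some
    # multiple (x2, x3, ...) that lands inside the window.
    return vrr_max_hz >= 2 * vrr_min_hz

def effective_refresh(fps, vrr_min_hz, vrr_max_hz):
    # Smallest multiple of fps that lands inside the VRR window, or
    # None if no multiple fits. (fps above the window is not modeled;
    # in practice vsync caps it there.)
    mult = 1
    while fps * mult <= vrr_max_hz:
        if fps * mult >= vrr_min_hz:
            return fps * mult
        mult += 1
    return None

print(lfc_possible(48, 60))            # False: 43 fps would need 86 Hz
print(effective_refresh(43, 48, 60))   # None -> tearing or stuttering
print(effective_refresh(43, 48, 144))  # 86: each frame scanned out twice
```

On the 48-60Hz monitor from the example, 43fps has no usable multiple (86Hz is out of reach), which is exactly why the narrow-range panels misbehave.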


Ultimately it will mean prices of FreeSync monitors will go up,

all for the sake of a logo, so that monitor manufacturers can appeal to Nvidia card owners as "G-Sync Compatible".

This isn’t adoption, it’s absorption.


It's unclear if they charge for the logo. Maybe? If so, it'll certainly be a lot less than the $200 to $500 the G-Sync hardware module cost, and it will lead to VRR monitors that are actually useful in most scenarios. A monitor with a VRR range of 48 to 60Hz isn't super awesome.

Beyond the logo, it will push up the bottom quality levels; a rising tide helps everybody. Cheaper VRR monitors, even if uncertified by Nvidia, will have wider VRR ranges so people don't call them garbage. This is a good thing.


Why wouldn't they?
They charge $10 to give motherboards an SLI certificate… Not $10 per model: $10 per motherboard.
If ASRock manufactures 1,000 units of the X370 Killer SLI, Nvidia makes $10,000 from that alone…
You think they won't charge for a "G-Sync Compatible" sticker?


I made sure mine were the widest range I could get (44 to 144) for the best experience :smiley:

They are awesome

I wanted 30, but at the time the only models that went that low were almost as expensive as sodding G-Sync monitors!

Ultimately… as long as I stay above 44 and cap at 144, things are golden.


That is not how LFC works.
LFC, as AMD implemented it, fakes a "half frame", i.e. repeats the image. The refresh interval at that point is about 16ms, as the GPU still has the previous frame buffered (Nvidia does this in the module).
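The ~16ms figure can be sketched numerically (my own illustration, assuming a 30fps game on a hypothetical 48-144Hz panel): repeating each frame twice lifts the scan-out rate to 60Hz, so refreshes land about 16.7ms apart even though the image only changes every second refresh.

```python
def lfc_repeat_schedule(fps, vrr_min_hz, vrr_max_hz):
    # Find the smallest repeat count that lifts the scan-out rate into
    # the VRR window, then report the interval between refreshes.
    repeats = 1
    while fps * repeats < vrr_min_hz:
        repeats += 1
    scanout_hz = fps * repeats
    if scanout_hz > vrr_max_hz:
        return None  # no repeat count fits the window
    return repeats, round(1000.0 / scanout_hz, 1)  # (repeats, ms per refresh)

# 30 fps on a 48-144 Hz panel: show each frame twice, ~16.7 ms apart.
print(lfc_repeat_schedule(30, 48, 144))  # (2, 16.7)
```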

As a daily FreeSync user, I can say it is very usable (or have you fallen for the Ngreedia lies?).

Previously it was 60Hz or get lost; now a VRR range of only 12FPS is unacceptable? And when 60FPS is exceeded, it also does not matter, as only whole frames are drawn.

Fact of the matter is: Nvidia is spinning itself as the saviour of tech (which it isn't), and Ngreedia's stock price will take a little step up for each printed sticker they slap on something that was free before.