
NVIDIA To Officially Support VESA Adaptive Sync (FreeSync) Under “G-Sync Compatible” Branding



Back when I bought my G-Sync monitor, I was fully aware I was bending over for the Nvidia tax. It was my only option if I wanted adaptive sync, and AMD didn’t have anything competitive with the performance of my Nvidia card at the time, so Nvidia was my only real choice for the level of gaming experience I wanted. I could afford it, so I begrudgingly paid the tax. I’m not really upset by this move; I haven’t lost anything I didn’t have yesterday, though I am disappointed that I’m still locked into Nvidia cards going forward if I want to keep using this monitor.

This is really a great thing for the industry as a whole, and I’m very happy for all those with Nvidia cards who couldn’t afford a G-Sync monitor; they will now have more affordable adaptive sync available to them. Consumers in general can confidently buy a monitor they know will last them for years to come, without being locked into one single brand of GPU.

This really is something that shouldn’t have taken as long as it did for Nvidia to implement, so I’m not going to heap praise on Nvidia for the move, other than to say I’m happy they eventually did the right thing.

The consumer really won this one.


I own an Nvidia graphics card and an adaptive sync monitor. How will I know if it’s working/turned on? I never expected to be able to make use of this feature.


Ask @MazeFrame he has been playing around with it.


@Zibob called
Most monitors (3 out of 3 FreeSync screens I had my hands on :wink: ) have FreeSync off by default, so you have to dive into the on-screen settings and enable it on the monitor side.

LG "where to look":
  1. Menu > Picture
  2. Game Adjust > FreeSync
  3. Set to “ON”

It is less hidden on Samsung Monitors

Then there is probably something in the driver control panel to turn on as well.
On AMD, it is under “Display”; I can’t check for Nvidia.


Gotta kill AMD’s Freesync branding somehow, and claim victory with “G-Sync Compatible”!


My two cents is this: maybe Nvidia hardware just doesn’t support the FreeSync open standard well. FreeSync clearly works, or we would hear more about it not working on AMD cards. I think the G-Sync hardware side might not play well with it because Nvidia wants to sell their G-Sync HDR displays.


Nah, the reasons Nvidia gave are “legitimate” excuses.

But still bullshit.

  • Who cares if FreeSync needs to be turned on manually? That’s literally a driver tweak - pop up a box asking if you want it on.
  • Sure, some cheap FreeSync monitors have a limited FreeSync range and maybe some issues (which, as I understand it, mostly got fixed). A limited FreeSync range is still better than fixed sync.

Nvidia just wants to keep the G-Sync branding in people’s minds, and thus be able to claim “G-Sync” (even when it is implemented as FreeSync) as an NV-exclusive feature.

It’s bullshit anticompetitive behaviour.


If anyone prefers to read, the article is here:

(FreeSync) was never proven to work. As you know, we invented the area of adaptive sync.

This is utterly ridiculous, since Adaptive Sync dates back to the 2009 spec for eDP (embedded DisplayPort), and G-Sync was announced in 2013. Unless they convinced VESA to add it to the spec, I don’t see how they can claim to have invented it.

At best, maybe they could claim Low Framerate Compensation as an invention?


Adaptive-Sync has been a part of VESA’s embedded DisplayPort, eDP, spec since 2009 and as a result, a lot of adaptive-sync technology is already incorporated into a lot of the components for displays that rely on eDP for internal signaling.


Oh easily, like this:

And then the fans eat it up and parrot it back because nVidia is better; before you know it, popular memory recalls nVidia being first.

Which would be fine, really, if the record was set straight, but whenever someone tries in any legit manner that gains traction, they will spin it again.


Anyone know if the Maxwell series will see any of this love?

I’m eyeing off a couple of 2K 144Hz FreeSync monitors.
Having FreeSync would just sweeten the deal!


I would be surprised if Nvidia still supports Maxwell in drivers. So my estimation is no.


Even the 600-series is still getting drivers, but I doubt anything older than the 10-series will get official Freesync support.


It needs DP 1.2a and HDMI 1.4 if I’m not mistaken, so you are probably correct… The 900 series doesn’t support DP 1.2a, and FreeSync over HDMI came out during the 10 series, so yeah…


I haven’t seen testing on it yet, obviously, but my assumption is that the lack of low framerate compensation (i.e. repeating frames at low framerates) is why some FreeSync monitors sometimes exhibit flashing on Nvidia GPUs. That flashing renders them unusable.

Real hardware G-Sync doesn’t need to repeat frames from the GPU side, because the G-Sync module keeps a copy of the last frame buffered and just repeats it when necessary on the monitor side. So Nvidia approached the problem differently from the start.

As for some of the other artifacts we’ve seen from CES (blurring when looking down the scope of a gun, fast-motion artifacts), that stuff is probably the monitor scaler’s fault, and Nvidia is correct there. Just my educated guess.

That’s not to say those things couldn’t be addressed at the driver/GPU level too. AMD obviously did so.
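The frame-repeating idea behind low framerate compensation can be sketched roughly like this. This is a simplified illustration of the general technique, not Nvidia’s or AMD’s actual implementation, and the 48–144Hz range is just an assumed example:

```python
# Rough sketch of low framerate compensation (LFC): when the game's
# framerate falls below the panel's minimum variable refresh rate, the
# source (driver or G-Sync module) re-sends the last frame, so the
# effective refresh rate lands back inside the panel's range.
# Numbers and structure are illustrative only.

def lfc_refresh_rate(fps, vrr_min=48, vrr_max=144):
    """Return (effective refresh rate, repeats per frame) that keeps
    the panel inside its variable refresh range."""
    if fps >= vrr_min:
        return fps, 1  # in range: one scan-out per rendered frame
    repeats = 2
    while fps * repeats < vrr_min:
        repeats += 1  # show each frame more times at very low fps
    return fps * repeats, repeats

# e.g. 30 fps on a 48-144 Hz panel: each frame is shown twice at 60 Hz
print(lfc_refresh_rate(30))  # (60, 2)
print(lfc_refresh_rate(20))  # (60, 3)
```

Without this repetition (the situation described above for some FreeSync panels on Nvidia), the panel is driven below its minimum refresh rate, which is where the flashing and tearing reports come from.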


Woo hoo! My FreeSync monitor is recognized as G-Sync.

G-Sync Compatible Korean ( & Pixio) Monitors Testing

Dang it. So no love for the 980Ti :frowning:


The 900 series Nvidia was full-on proprietary mode…


How does it work at low framerates? Try running DSR to get your fps below 48.


It worked great in the Nvidia Pendulum + test pattern demo. Vsync was jerky, no sync was tearing, and FreeSync was silky smooth.

My monitor’s spec only has FreeSync active from 46Hz to 75Hz. Below 46Hz, tearing is noticeable. Not a problem for me; my system is balanced to run most games at 1440p around 75FPS. FreeSync seems to just smooth things out when the framerate drops for a few seconds.
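A 46–75Hz range like that also explains why there is no frame doubling to fall back on. A common rule of thumb is that low framerate compensation needs the panel’s maximum refresh to be at least roughly twice its minimum, so a frame can be repeated without overshooting the top of the range. A quick check (the rule-of-thumb threshold is an assumption, and real driver behavior may differ):

```python
# Rule of thumb: LFC needs vrr_max >= 2 * vrr_min so that a frame can
# be doubled without exceeding the panel's maximum refresh rate.
# Illustrative check only; actual driver heuristics may vary.

def supports_lfc(vrr_min, vrr_max):
    return vrr_max >= 2 * vrr_min

print(supports_lfc(46, 75))   # False: doubling 46 Hz gives 92 Hz > 75 Hz
print(supports_lfc(48, 144))  # True: 144 >= 96, doubling fits
```

That matches the observed behavior: with a 46–75Hz range there is nothing the driver can do below 46fps, so tearing shows up.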


It has just occurred to me.

The demo nVidia was showing off with the black flickering is supposedly down to the lack of low frame rate compensation. But is there any nVidia card from the 10 series or the new 20 series that doesn’t run a game at 60+ fps, especially when standing in one spot like they showed and just looking around?

I can’t call shenanigans because I just don’t know, but thinking about it, it seems very odd. The area was plain, nothing physics-wise happening, no gunfire or running around, just looking.

I wonder if they deliberately ran it on the likes of a 1050 to make that happen. Did any YouTuber there actually check the system specs of the machine running the demo? I know a few took a look at the panel and its maker.