NVIDIA To Officially Support VESA Adaptive Sync (FreeSync) Under “G-Sync Compatible” Branding

dlss = upscaling without the obvious signs of checkerboarding…

in short, take what amd did originally on consoles and do it better.

Pretty much yeah, no checkerboarding or temporal anti-aliasing artifacts. DLSS is potentially extremely cool technology, and it is Nvidia-specific.

Do you think you could expand on your OP with how you feel about this change and what it means for the industry? Its a little low effort to say “ouch gsync users”.

I wonder what the actual compatibility will look like. Do you think there will be an option to force adaptive sync on all FreeSync monitors, or will it be a select few premium models supported under the “G-SYNC Compatible” branding?


No need to ask for our opinion; the AnandTech article says Nvidia will allow users to force VRR on non-certified monitors. Nvidia is completely giving up on this fight.

I see, I got confused by the part where it's enabled by default. I just kinda skimmed the article.

Perhaps you meant well when you posted this. However, having the first post be one sentence is too low effort. If you want to post low-effort content, do it in the Lounge.
When making a thread, follow this guide:

General Guideline for Making Threads Rules, Guidelines, FAQ

All users are welcomed and encouraged to make threads. Before you make a thread, here are some basic general guidelines: The main purpose of a thread is to encourage other users to post in response to its creation. The title of your thread should be brief and descriptive of the subject of the thread. Use tags. The body of the thread should begin by talking about the subject of the thread. If it is a video: share your thoughts on the video and what the video is about. If it is a link to a news story: share what the news story is about and what you think about it. Invite other members to respond to what you posted. If your thread does not encourage or create the opportunity for discussion/response, then don’t make it.

As for this thread, it is locked due to being too low effort, but it will be reopened once the OP puts their thoughts into the OP to give the thread a general direction. As it stands right now, it sets the thread up for circle-jerking.

Also, now that this thread is temp locked, please @everyone do not create another thread regarding this topic. All threads that are created will be merged into this one.


OP added their thoughts

Thanks for adding some meat to discuss. :+1: :slightly_smiling_face:

I agree. Also, I’m curious both why Nvidia is making this move now and why they haven’t done so earlier.

Good question. I don’t know either, as I don’t use FreeSync or G-Sync, but I would also like to know. I guess we will see how both the monitor market and consumer decisions develop in the future.

I bought my G-Sync monitor a couple of years ago, so I’m fine with it. If I had purchased recently, I’d be irate.

I bought mine about 18 months ago and I’m still miffed… I would have loved to keep the same monitor, and the savings would have been a nice chunk of change.

Careful folks, don’t be bashing on NVidia now, they couldn’t help themselves but have now seen the light, bless their hearts.


Does NVidia finally see the writing on the wall for G-Sync? If so it’s about time.

Of course that means AMD is in serious trouble if NVidia supports Freesync. That was one of their biggest advantages in my opinion.

Facepalm! I own a 1060 6GB and an Acer 27-inch 144Hz G-Sync monitor that I got two winters ago. I got it for 200 USD, which is significantly less than what it was retailing for.

G-Sync has always been the superior technology. The dedicated hardware means a greater range of frequencies. There are fundamental differences. The only reason it failed is the high cost. I don’t agree this is a win; finding a way to support both technologies at a lower cost would be a win.

Arguing against RTX doesn’t make sense either. Sure, it’s expensive, but you should be arguing for competition, not against new developments. RTX is an amazing development, and I look forward to seeing AMD’s competing technologies. New technology is always expensive, but the price should come down.

There’s no particular reason why VRR monitors couldn’t support the same frequency ranges as G-Sync. I assume that’s what the “G-SYNC Compatible” monitors do.

As for one being the superior technology, they’re basically the same thing. G-Sync lives in a separate module, and Nvidia mandates tight specs for validation, like that refresh range, but otherwise they’re identical.

G-Sync also keeps a full frame buffer in the module’s memory, but it’s never been entirely clear why that’s advantageous. Plain old VRR works fantastically on AMD GPUs, and I don’t see why it wouldn’t do the same on Nvidia ones.


Good. Now all the display vendors need to drop every single monitor with G-Sync hardware, and then we have two brands for the same open standard.


That has never been proven, and in the case of frequency range, it’s also untrue.

FreeSync supports 9 to 240Hz; G-Sync, 30 to 144Hz.


I was just waiting for this. Literally like I said above. Lol.

Why try to find a way to support both cheaply when one is free as part of the VESA standard and the other requires proprietary, expensive, and objectively worse technology? This is competition in action, exactly as we want, and FreeSync won.

You couldn’t support G-Sync cheaply, because you were forced to purchase a $200 hardware module from Nvidia. And the module for 4K HDR reportedly cost $500!


From the CES thread.

What I wonder is how it’s even possible that only 12 out of 400 (and potentially 550) monitors pass their certification.

I was right in thinking that adaptive sync is a standard, and there have been very few complaints about these monitors from AMD users.

Could it be that the 12 they approved are the only ones willing to license the Nvidia branding?
