Yes G-Sync was first (by a week or so).
I would guess AMD came later because they submitted it to VESA before going public with it.
> Yes G-Sync was first (by a week or so).
That is how I remember it as well.
I thought G-Sync came first, but I didn't know by how much. I guess I don't understand why they tried to ride that pony for so long. Perhaps they knew they couldn't keep getting away with it, but the red tape stopped it from happening sooner.
It (g-sync) would have been first to actually be available to use and functional yes.
I still have no idea what any kind of adaptive sync looks/feels like.
Neither do I. Lol. I just really abhor nVidia as a company with their tactics. Even just thinking about it makes me feel like I have done something wrong and am actively holding back developments.
It is not a bad decision marketing-wise. Put an imaginary badge on it, price it higher, maybe tout some "special hardware"; if it is more expensive, it must be better, right? RIGHT?
IIRC, G-Sync lags one frame behind because it buffers the frame in the "module".
Imagine dipping below 60, maybe even below 40 and it still does not feel choppy, vertical lines do not get offsets in them, etc.
It smooths out the ride.
To me, the absence of FreeSync is more noticeable than high FPS.
It's vsync in reverse: the monitor works at the graphics card's speed instead of the GPU working at the monitor's speed.
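To make that "reverse vsync" idea concrete, here is a minimal sketch (illustrative numbers only, not measurements): with vsync a finished frame waits for the monitor's next fixed refresh tick, while with VRR the monitor starts scanning out as soon as the frame is ready, as long as it isn't faster than the panel's maximum refresh.

```python
import math

REFRESH_HZ = 60
TICK_MS = 1000.0 / REFRESH_HZ  # fixed refresh interval, ~16.67 ms

def vsync_display_time(frame_ready_ms):
    """With vsync, a frame waits for the *next* fixed refresh tick."""
    return math.ceil(frame_ready_ms / TICK_MS) * TICK_MS

def vrr_display_time(frame_ready_ms, min_interval_ms=1000.0 / 75):
    """With VRR, the monitor refreshes when the frame is ready,
    capped only by the panel's maximum refresh rate."""
    return max(frame_ready_ms, min_interval_ms)

# A frame finished at 20 ms: vsync holds it until the 33.33 ms tick,
# while a VRR panel would have started scanning it out at 20 ms.
print(round(vsync_display_time(20.0), 2))  # 33.33
print(round(vrr_display_time(20.0), 2))    # 20.0
```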
I have been drooling over FreeSync monitors for a while now and I still don't have one, and I game just fine, but people say it's really nice…
You know how it feels butter smooth when you have a game locked at 60fps? VRR feels like that at 41fps too, and 37fps, and 55fps.
Well, 60fps feels like shit to me now, so I guess there are diminishing returns at higher refresh rates?
Not with tearing; I noticed 15 seconds into a game when a driver update had turned off adaptive sync.
I personally will never go back to not having it.
Yes, it eliminates tearing even at high refresh rates.
More importantly though, is that it feels consistent. Without VRR on a 144Hz gaming monitor, if your framerates vary from 65fps to 114fps, even though they stayed above 60fps the entire time, the wildly variant frametimes don’t feel perfectly smooth. VRR fixes that.
I wouldn’t say it’s transformative if you have a high-end GPU and get over 60fps anyway. It’s most impressive when your rig is struggling at <60fps. But it does make a difference at the enthusiast level too.
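The frametime point above can be sketched with some made-up numbers: even when the frame rate stays above 60fps the whole time, the frame-to-frame delivery intervals swing by several milliseconds, and that swing is what reads as stutter. VRR lets the monitor track those intervals instead of forcing them onto a fixed grid.

```python
# Sketch: compare frame delivery intervals for a locked 60 fps stream
# versus a stream swinging between 65 and 114 fps. Numbers are
# illustrative, not measurements from any particular game.

def frametimes_ms(fps_samples):
    """Frame-to-frame intervals in milliseconds for each fps sample."""
    return [1000.0 / fps for fps in fps_samples]

def jitter_ms(times):
    """Worst-case difference between consecutive frame intervals."""
    return max(abs(a - b) for a, b in zip(times, times[1:]))

locked = frametimes_ms([60] * 6)
swinging = frametimes_ms([65, 114, 80, 110, 70, 100])

print(round(jitter_ms(locked), 2))    # 0.0  -> perfectly even pacing
print(round(jitter_ms(swinging), 2))  # 6.61 -> several ms of swing
```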
So here is the question… what about a G-Sync monitor with an AMD GPU, how well does that work? Potentially down the road, if I can chuck my 1080 Ti for an AMD product, I will… and I really would prefer not getting another monitor.
It works fine as a monitor, but you won’t get any variable refresh rates, no.
There seems to be some confusion still on this topic, and it's no surprise with how Nvidia is going about it, but here's the deal:
Any VESA Adaptive-Sync monitor (almost always branded FreeSync) will work with Pascal and newer Nvidia GPUs.
The G-Sync module monitors will never do VRR with anything but Nvidia GPUs.
The G-Sync certified monitors will be the VESA Adaptive-Sync monitors Nvidia deems good enough to carry the G-Sync branding.
That's what I kinda figured… oh well, at least I can resell it.
NVidia is going to obfuscate and confuse as much of the public as they fucking can with this.
I look at this as a completely positive and long-awaited move.
I welcome the "G-Sync certification" because Nvidia has a point; many cheaper FreeSync monitors had very narrow refresh ranges that limited their utility, while every G-Sync monitor worked great.
The confusion will be long forgotten when (in like 6 months) nobody is selling monitors with hardware G-Sync modules anymore, and everybody knows to look for that "G-Sync certified" seal even if they're running an AMD GPU.
TESTED: It seems to work, but maybe I’m wrong?
System: i7-4790K, 16GB RAM, GTX 1070, HP Omen 32 QHD @ 75 Hz FreeSync monitor. I can run most games at 2560x1440 Ultra, pegged at 75 FPS with V-sync.
I tried Just Cause 3, Forza Horizon 4 and 3DMark Fire Strike.
I turned Adaptive Sync on and off in the Nvidia control panel.
Game settings had V-sync off.
Just Cause and Forza stayed locked at 75 FPS as if vsync was on. The Forza benchmark showed I was rendering between 85-120 FPS, but the monitor's on-screen display was locked at 75 FPS.
3DMark looked good (no visible tearing) but exposed the downside of FreeSync: lack of range? With A-Sync off I was getting up to 90 FPS, with noticeable screen tearing when it slowed down. The tearing was worst in the final, demanding combined test. The overall score was higher with A-Sync off.
With A-Sync on I saw no tearing until the final test, and the monitor varied between 50-75 FPS; the max was capped at 75 FPS. In the combined test I sometimes got framerates below 30 FPS and could see some tearing. I guess 30 FPS is too low for a FreeSync monitor to adapt to?
My test was by no means comprehensive, nor did I repeat tests; I just wanted to see for myself.
Edit: HP Omen 32 monitor specs
The VRR tech only operates from 48 Hz to 75 Hz
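That 48-75 Hz window would explain the tearing below ~48 FPS in the test above. As a hedged sketch: a frame rate is only matched directly inside the window, and AMD's Low Framerate Compensation (LFC), which repeats frames to cover low frame rates, is generally only possible when the max refresh is at least twice the min, which a 48-75 Hz panel doesn't satisfy. The numbers here are taken from the Omen 32 spec quoted above.

```python
def in_vrr_window(fps, vrr_min=48, vrr_max=75):
    """True if the monitor can match this frame rate directly."""
    return vrr_min <= fps <= vrr_max

def lfc_possible(vrr_min=48, vrr_max=75):
    """LFC needs max >= 2 * min so low frame rates can be covered
    by repeating each frame inside the VRR window."""
    return vrr_max >= 2 * vrr_min

print(in_vrr_window(60))  # True  -> smooth, monitor tracks the GPU
print(in_vrr_window(30))  # False -> below the window, tearing returns
print(lfc_possible())     # False -> 75 < 2 * 48, no frame doubling here
```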
That is very worrying; it effectively makes everyone believe their AMD card is doing great things because of nVidia tech… which is an outright lie, because it is FreeSync all along and nVidia is co-opting the branding.
This is possibly the worst impact of this. Their tech failed to make a significant impact, so they are back to stealing others' work and calling it their own, warping the perception of all buyers hereafter.
Edit: now I wonder if the nVidia certification will include a clause that, should a monitor be awarded the badge, it is not allowed the FreeSync branding, as G-Sync would supersede it in their warped world of lies and cheating.