Subjective Opinions of Gsync

I'm very interested in hearing a gamer's opinion of gsync.  If you've used a gsync panel, will you please share your (subjective) impressions of how this technology affects your gaming experience?

I'm considering purchasing the ROG Swift and would like to know what kind of impact G-Sync makes. Most of the reviews seem very positive, but what do actual users think?

I have not as of yet, but I hope to as soon as I win the Swift, lol

The theory behind G-Sync (adaptive sync) is nice. G-Sync itself, however, is dumb, because it diverges from, and is therefore not compatible with, the VESA standard for adaptive sync. And since it does not comply with the VESA standard (when there is a standard in place), it should not exist.

I agree with kiaxa. G-Sync is a great innovation for monitors, but proprietizing something that already has an open standard is dumb, and nvidia is renowned for doing this. AMD's FreeSync, on the other hand, seems to stick to the standard, which was expected, because AMD is in general a very open and collaborative company.

That said, buying a monitor with G-Sync is not going to make a big difference right now, because games probably will not be taking proper advantage of it (or any advantage at all) for the foreseeable future. The ROG Swift is nice, but buying a monitor solely for one feature such as G-Sync is not smart, because monitors are not graphics cards. What I mean is that monitors are an investment: you probably replace one every 3-4 years, maybe even every 7 or 10, versus a graphics card every new iteration or two. So when I hear people bought a monitor just because of G-Sync (not saying you did), I chuckle. All in all, G-Sync is cool and awesome, but it is still very much in its infancy. I would wait a couple of years to buy a monitor with it, because it is not worth the extra cash yet. Get a monitor that fits you. If you want G-Sync, go for it, but it's not really a huge game changer right now.


I should add that if you have a graphics card that supports G-Sync, or plan on getting one, then you should most certainly buy a monitor that supports it too, so you can take full advantage of the card. If you don't, save your money for something better later.

What I don't understand, and perhaps others here can explain, is why G-Sync should require (for technical reasons) an NVIDIA graphics card if all the work is being done on the monitor's side with the FPGA.

It would then be a completely different story, and nvidia still gets to monetize the technology.

nvidia has two options:

  • either open up G-Sync to AMD (not happening in a million years) to remain competitive, or
  • wait for it to die out as the majority of monitor manufacturers opt for the free standard they don't have to pay for

G-Sync was doomed as soon as FreeSync became part of the VESA-backed standard.

I, for one (being an AMD user), will be patiently waiting for FreeSync. I shall grit my teeth and put up with screen tearing (the lesser of two evils, tearing vs. judder)

...for now, anyway


The answer is simple. As with any programmable system, you can whitelist hardware... I am betting you it's a whitelist. It could also be a slight modification of the protocol, but I am no expert. Because of this, you must have an NVIDIA GFX card.
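To illustrate, a hardware whitelist could be as simple as the sketch below: the monitor (or driver) reads the GPU's PCI vendor ID during the handshake and only enables the feature for vendors on the list. This is a minimal Python sketch of the general idea; the function name and the notion that G-Sync's check works exactly this way are my assumptions, not NVIDIA's actual implementation (the vendor IDs themselves are the real PCI-SIG ones).

```python
# Hypothetical sketch of a vendor-ID whitelist check, NOT NVIDIA's real code.
# Real PCI vendor IDs (assigned by PCI-SIG):
PCI_VENDOR_NVIDIA = 0x10DE
PCI_VENDOR_AMD = 0x1002

# Only whitelisted vendors get the adaptive-sync handshake accepted.
GSYNC_WHITELIST = {PCI_VENDOR_NVIDIA}

def gsync_handshake_allowed(vendor_id: int) -> bool:
    """Return True only if the GPU's PCI vendor ID is on the whitelist."""
    return vendor_id in GSYNC_WHITELIST

print(gsync_handshake_allowed(PCI_VENDOR_NVIDIA))  # True
print(gsync_handshake_allowed(PCI_VENDOR_AMD))     # False
```

If it really is just a check like this, that would also explain the later question in this thread about whether reflashing the FPGA could defeat it.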

Competitively, consistency is typically better than higher FPS or Hz. I wonder if the Hz being dynamic on a G-Sync or FreeSync monitor will make it less desirable for competitive FPS play.

Yes, that is what I am thinking, but I wonder if there are any technical reasons. Following that, my question would be: what stops hackers from reprogramming the FPGA to trick it into accepting an AMD card, or flashing the firmware/BIOS to trick the monitor?

You don't need to. FreeSync is basically the same thing, with no extra hardware needed in the monitor.