The Sync-Wars: AMD, Nvidia, and VESA

I originally had this posted on my website, but I figured it'd be worth sharing here too. Jumping right in: remember G-Sync, the Nvidia-only proprietary technology that adds adaptive refresh rates to monitors? Now recall the recently announced FreeSync, the exact same technology from AMD, only without the proprietary hardware? Well, Nvidia recently responded to AMD's FreeSync.

It's a good read, and it clarifies a lot about G-Sync and Nvidia's intentions in all of this. I decided to write a quick summary of events anyway because, as I'm sure most graphics programmers have done, I've looked into refresh-rate workarounds and technologies throughout my time in computer graphics. So, for the sake of context, we'll cover everything from the ground up.

Starting from the beginning: in all display technologies, even cathode-ray televisions, the vertical blanking interval (VBI), otherwise known as VBLANK, is the gap between the last line of one frame and the first line of the next. During a VBI no picture information is transmitted to the display, originally to give the electron beam time to return to the top of the screen. If you vary this interval, you also vary the refresh rate of the screen (the number of times per second the display hardware updates its image). If you then link, or synchronize, this with the rate at which the GPU draws and outputs frames, you get an "adaptive" refresh rate. It goes by various names (refresh rate switching, variable frame rate, variable VBLANK, dynamic refresh rate, etc.), but they all essentially mean the same thing.
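To make the arithmetic concrete, here's a minimal sketch in Python. The timing numbers are made-up assumptions, not real panel specs; the point is just that the panel scans out the visible lines in a roughly fixed time, then idles through VBLANK, so stretching the blanking interval directly lowers the effective refresh rate.

```python
# Minimal sketch of refresh rate vs. VBLANK length. The timing numbers are
# illustrative assumptions, not real panel specs.

ACTIVE_SCANOUT_MS = 13.9  # time spent scanning out the visible lines (assumed fixed)

def refresh_rate_hz(vblank_ms: float) -> float:
    """Effective refresh rate once the blanking interval is added in."""
    frame_time_ms = ACTIVE_SCANOUT_MS + vblank_ms
    return 1000.0 / frame_time_ms

for vblank in (2.8, 6.1, 11.1):  # progressively longer blanking intervals
    print(f"VBLANK {vblank:4.1f} ms -> {refresh_rate_hz(vblank):5.1f} Hz")
# Prints roughly 59.9, 50.0 and 40.0 Hz: vary the VBLANK, vary the refresh rate.
```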

[Image: Frame syncing as explained by Nvidia]

There are, however, numerous ways to perform this 'sync' stage. In Nvidia's implementation, using the G-Sync module, the display holds its VBLANK open until the next frame is received: the GPU outputs whatever frame rate the hardware can manage, while the monitor handles the 'sync' part. The opposite is true in AMD's implementation, where the VBLANK length is variable and the driver decides what length to set for the next frame. It does this via an additional hardware buffer on the GPU, which stores frames for timed release against the varied VBLANK lengths.
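To illustrate the difference, here's a toy Python simulation; it is not real driver code, and the rolling-average predictor is purely an assumed heuristic. In the G-Sync style, the panel refreshes whenever a frame arrives; in the variable-VBLANK style, the driver must guess the next frame time up front.

```python
import random

def simulate_gsync(frame_times_ms):
    """Display-side sync: the panel holds VBLANK until each frame arrives,
    so the refresh interval simply tracks the GPU's actual frame time."""
    return list(frame_times_ms)

def simulate_variable_vblank(frame_times_ms, window=4):
    """Driver-side sync: before each frame, the driver predicts the render
    time (here, a rolling average) and programs that as the VBLANK length."""
    history, intervals = [], []
    for t in frame_times_ms:
        recent = history[-window:]
        guess = sum(recent) / len(recent) if recent else 16.7  # assume ~60 Hz to start
        # Simplification: a frame that outruns its prediction just stretches
        # the interval; a real controller would re-scan the previous frame.
        intervals.append(max(guess, t))
        history.append(t)
    return intervals

random.seed(1)
frames = [16.7 + random.uniform(-4.0, 8.0) for _ in range(6)]  # jittery GPU output
print("GPU frame times :", [f"{t:5.1f}" for t in frames])
print("G-Sync-style    :", [f"{t:5.1f}" for t in simulate_gsync(frames)])
print("Variable VBLANK :", [f"{t:5.1f}" for t in simulate_variable_vblank(frames)])
```

The display-side loop tracks the GPU exactly, while the driver-side loop lives or dies by its prediction, which is exactly the overhead trade-off discussed next.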

In AMD's implementation, the guesswork involved in picking the right VBLANK length for the next frame can add some software overhead on the host system. In Nvidia's implementation that overhead moves into the monitor instead: the G-Sync module must handle the buffering and timing associated with its additional frame buffer. The display controllers inside Nvidia GPUs do not support dynamic refresh rates the way AMD's do, which is presumably why Nvidia resorted to external hardware in the first place.

With the differences between the implementations out of the way, it's time to tackle some history, so let's rewind slightly.

Some years ago VESA created a free, better alternative to HDMI called DisplayPort. (HDMI, for those not aware, is not a free VESA-approved standard; it was created by a separate for-profit group of corporations that charge a licensing fee for its use.) Laptop manufacturers said 'cool, a free display interface! Let's get that extended a bit for our use,' and so Embedded DisplayPort (eDP) was born, including a power-saving feature for "seamless refresh rate switching." AMD, Nvidia, and even Intel support this, because it's in the standard for mobile/laptop display components (and because they keep unifying their mobile, laptop, and desktop hardware). Normal desktop monitors do not support eDP, since it's an embedded interface, so nobody noticed for years (LG did pitch the idea back in 2011, but the concept fell on deaf ears).

All the while, VESA kept doing its thing with regular old DisplayPort: releasing revisions, making it better, and planning DisplayPort 1.3, which is due for release this year. Version 1.3 also introduces the same refresh rate switching feature that eDP already has.

Nvidia saw this coming, since they're a member of VESA; who knows, it's likely they even helped develop DP 1.3 and the logic controllers to support it. Yet someone at Nvidia decided: 'Let's pitch this old idea that's about to come to desktops anyway as our own, before everybody else supports it because it's the freaking standard. Oh, and let's re-brand it so it looks like we came up with it too.' Some time later Nvidia revealed G-Sync, and everyone declared variable refresh rates a stroke of genius.

Fast forward roughly 4-6 months to CES 2014, and AMD clarified: 'We've had support for variable refresh rates since the Radeon 5000 series in our desktop GPUs. Have a look at our APU doing the exact same thing as G-Sync on a regular laptop, because laptops use the eDP standard for their display hardware.'

Surprisingly, eleven years prior, AMD (known as ATi at the time) had already filed a patent for hardware-based dynamic framerate adjustment (also the most likely reason Nvidia doesn't have the same on-GPU frame-syncing solution as AMD, as explained above). AMD implemented it starting with the Radeon 5000 series back in 2009, for use with monitors supporting variable VBLANK, which monitor manufacturers (VESA and co.) never got around to building into desktop displays.

It quickly became clear that all Nvidia had done was reproduce, and separately sell to consumers and monitor manufacturers alike, an ASIC supporting a feature of DisplayPort 1.3 before it became a ratified standard, and, worse, lock it down to function only with Nvidia hardware. It's probable that G-Sync is just eDP or DP 1.3 in camouflage; who knows, since Nvidia avoids detailing the inner workings of its secret sauce (especially if it's likely a rehashed open specification).

More importantly, though, we can draw some conclusions from this. We can safely predict that monitor manufacturers will support DP 1.3 within the next 4-8 months (Q2/Q3 2014), or, ironically, even sooner thanks to the recent G-Sync/FreeSync wars turning heads at CES 2014. In fact, according to this article by PC Perspective, AMD mentioned that "there might be some monitors already on the market that could support variable refresh rates with just a firmware update. This would be possible if a display was shipping with a controller that happened to coincidentally support variable refresh rates, perhaps in an early stage of development for the upcoming DP 1.3 standard."

From the same interview, AMD also asserted that it "wasn't bringing this demo [FreeSync] out to rain on NVIDIA's G-Sync parade but instead to get media interested in learning about this feature of eDP 1.0 and DP 1.3, urging the hardware companies responsible to more quickly produce the necessary controllers and integrate them with upcoming panels in 2014." AMD's recent blog post on the topic has many more white-knight-worthy statements that I'm too lazy to quote.

Admittedly I may seem slightly biased here, and that's because I am. I tip my hat to Nvidia for making some noise about how important and valuable variable refresh rates are. Truthfully, people would never have cared about variable refresh rates had Nvidia not made such a valiant sales effort over the whole affair, but it comes as no surprise that they're trying to sell as proprietary technology what is already (or soon will be) a freely available feature in every VESA-approved panel. It's hard to see that as anything short of underhanded. While I doubt AMD's intentions with FreeSync were 100% honorable (they've certainly helped downplay Nvidia's efforts), they're at least trying to play catch-up in a healthy, open manner, and for that I must applaud them.

So, as a final word for those wanting or waiting to buy Nvidia GPUs solely for G-Sync: unless you're an early adopter with little care for budgets and performance-per-dollar, consider carefully that you'll be paying a premium for what is being implemented as a free, open standard for every PC user in the coming year.

If we were to introduce the subject of ethics and morals into this soap opera, what stance would be deemed right? I think my opinion on this particular showing is quite clear.

<\end_rant>

hi friend :D i never noticed you make this stuff too! i posted something similar a little while ago. you could add me on steam too. anyways, keep up the good work :D

(P.S. i agree with your thoughts as well. But i think that many people (as in non-hardcore, normal Xbox One gamers) won't find this attractive. What would really be killer is if AMD could combine their monitor tech into the consoles, so that it also works with TVs and such and non-tech-nerdy people get this kind of experience too.)

Very informative article.

Good read, thanks. :)

Great article.

What I love about AMD is their support for open standards. Nvidia constantly go for closed systems like Shield, G-Sync, etc. In the long term an open solution is better for everyone.


Well, shit. So long as there isn't a patent fight, I'll be happy.

Thanks for the post

Very informative! Learned a few more things :]

haha great article. 

Thanks all! Good to know everyone learned a thing or two! :D

Chances are there won't be a patent war over this one. The technique and its implementation are pretty old now, considering there are many on-board refresh-rate solutions developed by other vendors like Intel and LG. I'm sure Nvidia can cook something up for themselves. Heck, you can do the entire thing in software, so it shouldn't take more than a driver update from Nvidia once DP 1.3 is released.

Thanks for the article, I didn't know this technology already existed.

I wanted to get G-Sync since I play quite a lot, but now I'm considering waiting to see what happens with the standards and AMD.
