After updating to the newest Crimson driver I noticed some odd new behavior from my monitor that I hadn't seen before.
My monitor has a FreeSync range of 32-75Hz. Previously, when the framerate dropped under 32Hz, the monitor locked its refresh at 32Hz while displaying frames just like a monitor would without vsync enabled.
The new behavior is that instead of locking the monitor's refresh at 32Hz, it doubles the refresh rate so the same frame is displayed twice.
Essentially, if my program/game runs at 25 FPS, my monitor's on-screen display reports that the screen is running at 50Hz.
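For what it's worth, the doubling behavior I'm describing can be sketched like this. This is just a toy model of the idea, not AMD's actual driver logic, and the range constants are simply my monitor's numbers:

```python
# Toy sketch of low-framerate frame doubling -- not AMD's actual driver code.

FREESYNC_MIN_HZ = 32   # my monitor's reported minimum refresh
FREESYNC_MAX_HZ = 75   # my monitor's reported maximum refresh

def refresh_for_fps(fps: float) -> float:
    """Pick the smallest integer multiple of the frame rate that lands
    inside the monitor's variable refresh range."""
    multiplier = 1
    while fps * multiplier < FREESYNC_MIN_HZ:
        multiplier += 1
    refresh = fps * multiplier
    # Clamp to the panel maximum if the result exceeds it.
    return min(refresh, FREESYNC_MAX_HZ)

# A 25 FPS game: 25 * 2 = 50Hz, each frame scanned out twice.
print(refresh_for_fps(25))  # -> 50
# Even 10 FPS still lands in range: 10 * 4 = 40Hz.
print(refresh_for_fps(10))  # -> 40
```

The same multiplier trick is why this effectively extends the sync range downward: any low framerate has some multiple that lands inside the panel's window.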
I thought only gsync was able to do this.
This is something that I have heard about for a while as far as AMD supposedly implementing it. Good to know that it is finally here.
In a way this extends the display's sync range all the way down to 0.
As long as the monitor can keep its refresh at a multiple of the actual FPS being rendered, that is.
I thought only gsync was able to do this.
Was gonna say the same thing. Makes you wonder what the hell that G-Sync add-on board ($100) is for.
It's just so Nvidia fanboys will think "hey, that's $100 more than without G-Sync, it must be AWESOME!! :D" That's my impression of an Nvidia fanboy: they're so used to paying so much for stuff that doesn't perform much better than the last gen.
Reminds me of some liquor brand (I think it was a tequila of some sort) that wasn't selling well. What did the company do? Raised the price. They didn't change anything but the price, yet that increased the perceived value. It went from being a cheap liquor to a classy, expensive liquor (if tequila can ever be classy). Anyway, they sold more bottles because people are dumb and go off the perceived value instead of what they're actually getting. Same thing with G-Sync vs FreeSync.
Right now I can't really say "G-Sync sucks," because G-Sync is more mature than FreeSync, and G-Sync monitors tend to be higher quality from the get-go because Nvidia can choose what goes where.
But I'm really wondering why they're using dedicated onboard hardware for the adaptive sync; I don't see any reason for it, because everything it does can be done in software.
Or am I missing something?
I think it was part of the recent trend of solving problems by throwing hardware at them. But now that hardware solutions are slowing down and getting expensive, people are finally going back to software solutions.
Nvidia didn't cram hardware into the G-Sync laptops; they more or less use DisplayPort's native ability.
I think that speaks for itself when it comes to the whole ordeal: they're using what is more or less FreeSync because they don't get to cram hardware in there.
It's really hard to speak well of G-Sync when I can't think of any reason for the G-Sync module to exist other than to create more shills for Nvidia.
ahh yeah almost forgot about that. smh
There are now more FreeSync monitors than G-Sync ones, and G-Sync came out a whole year earlier.
Yep, because anyone can pick up freesync and plaster it over their product.
Just look at all those 4k Korean monitors for example.
I don't think those are going to be able to do what I described in the OP, as they only go from 60 down to 40-ish Hz.
I think the minimum of the range needs to be at most half the maximum; when the framerate drops under 40, the monitor can't double it to 80Hz.
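That rule of thumb (the range maximum must be at least double the minimum for frame doubling to have headroom) can be written as a quick check. This is just my reading of the constraint, not an official AMD requirement:

```python
# Rough check of the rule of thumb above: frame doubling only works if
# doubling any framerate just under the minimum still fits inside the range.
# My own interpretation, not an official AMD spec.

def can_frame_double(min_hz: float, max_hz: float) -> bool:
    return max_hz >= 2 * min_hz

print(can_frame_double(32, 75))  # -> True  (the 32-75Hz range from the OP)
print(can_frame_double(40, 60))  # -> False (39 FPS needs 78Hz, above the 60Hz max)
```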
Yeah, looking at the Crimson release notes, they did improve the FreeSync behavior:
FreeSync™ Enhancements:
- Minimum/maximum display rate is now listed in Radeon Settings
- Low framerate compensation to reduce or eliminate judder when application FPS falls below the minimum refresh rate of an AMD FreeSync™ enabled display
- New support for AMD FreeSync™ with AMD Crossfire™ in DirectX® 9 titles
These improvements, along with Intel moving toward Adaptive-Sync support, really obviate the need for G-Sync.
If they found a way to do this without the VRAM buffer, it puts it on par with G-Sync. The only other issue I saw was ghosting, but I didn't keep up to see whether that was fixed.
As far as price goes though, some of the first- and second-wave G-Sync monitors are on fire sale, so price really isn't an issue.
It's the broader range, the somewhat smoother gaming experience, and the ability to control the overshoot while overclocking that bit better. Currently G-Sync is proprietary hardware, but hopefully it will eventually become mainstream for all monitors, since the standards are already there. That pretty much describes how a lot of new tech comes to market: people are prepared to pay the early adopter tax to get the features they want.
Still, AMD do seem to be pushing the market in the right direction, and I'll have more choice next year.
I looked into this, and apparently they can do the same thing G-Sync does without a frame buffer: when the framerate drops, the GPU itself sends the same frame a second time rather than the monitor pulling it from an onboard frame buffer.
I don't know enough to say whether that solution is better or worse, but basically the GPU does the frame doubling instead of the panel.
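A toy model of the GPU-side repeat as I understand it: the GPU simply scans out the frame it already holds a second time, so the monitor never needs its own buffer. The names here are illustrative, not a real driver API:

```python
# Toy model of GPU-side frame repeating -- illustrative only, not a real
# driver API. The GPU keeps the last frame and reissues it, so no extra
# buffer is needed on the monitor side.

last_frame = None
scanout_log = []   # records what the display actually receives

def present(frame):
    """Send a newly rendered frame to the display."""
    global last_frame
    last_frame = frame
    scanout_log.append(frame)

def repeat_last_frame():
    """Low framerate compensation: resend the previous frame as-is."""
    scanout_log.append(last_frame)

present("frame_1")
repeat_last_frame()      # frame_1 shown twice -> effective refresh doubled
print(scanout_log)       # -> ['frame_1', 'frame_1']
```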
Yep, monitor is literally the GPU's slave with freesync, as it should be.
Wow, that sounds really good and opens up some of the narrow FreeSync ranges that are out there.
Good move, AMD.