Question(s) related to FreeSync

So, I’ve noticed this issue where some games caused my screen to flicker while others didn’t. To be more specific, I didn’t notice any flickering in Witcher 3, while it was quite noticeable in Red Dead 2. I spent quite some time figuring out what was going on and tried various things until it turned out that FreeSync (Adaptive Sync) could be the culprit.

Apparently, flickering occurs when the FPS drops below the minimum of the FreeSync range (48Hz on my particular screen). Therefore, everything under 48fps should result in flickering. Given that my monitor has a resolution of 5120x1440, with W3 running at 60fps and Red Dead 2 running at around 40-45fps, it’s apparent why I didn’t notice it in W3.
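Just to make the arithmetic concrete, here’s a minimal sketch in Python (the 48Hz floor is my panel’s; the fps figures are the rough numbers from above, with the Red Dead 2 value being my estimate):

```python
# Minimal sketch: flicker is expected whenever the frame rate falls
# below the monitor's minimum FreeSync refresh rate.
FREESYNC_MIN_HZ = 48  # lower bound of my panel's FreeSync range

def expect_flicker(fps: float) -> bool:
    """True if fps is below the FreeSync range, i.e. flicker territory."""
    return fps < FREESYNC_MIN_HZ

# Rough numbers from above:
for game, fps in [("Witcher 3", 60), ("Red Dead 2", 42)]:
    print(f"{game} at {fps} fps -> flicker expected: {expect_flicker(fps)}")
```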

However, there is a tool that allows you to decrease said minimum:

That being said, what I’m wondering is: if I set this minimum to 30Hz, shouldn’t the flickering then only start below 30fps? At least that’s what I think should happen. Any experience with that?

Furthermore, wouldn’t doubling the frequency in certain cases solve this problem? Take, for example, a monitor with a FreeSync range of 60-120Hz and an arbitrary game: if the game runs at 70fps, the monitor runs at 70Hz and everything is fine. But if the fps drops to 59, the refresh rate isn’t 59Hz but 118Hz (each frame displayed twice). For 30fps the frequency would be 60Hz.
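Here’s a sketch of how I picture that doubling logic (just my own illustration of the idea, not how any actual driver implements it; the 60-120Hz range is the example from above):

```python
# Sketch of the frame-multiplication idea: if the frame rate falls
# below the range, repeat each frame enough times that the effective
# refresh rate lands back inside the 60-120Hz window.
RANGE_MIN_HZ = 60
RANGE_MAX_HZ = 120

def refresh_for_fps(fps: float) -> tuple[float, int]:
    """Return (refresh rate in Hz, frame repeat count)."""
    multiplier = 1
    while fps * multiplier < RANGE_MIN_HZ:
        multiplier += 1
    refresh = fps * multiplier
    if refresh > RANGE_MAX_HZ:
        raise ValueError(f"{fps} fps cannot be mapped into the range")
    return refresh, multiplier

for fps in (70, 59, 30):
    hz, mult = refresh_for_fps(fps)
    print(f"{fps} fps -> {hz} Hz (each frame shown {mult}x)")
# 70 fps -> 70 Hz (each frame shown 1x)
# 59 fps -> 118 Hz (each frame shown 2x)
# 30 fps -> 60 Hz (each frame shown 2x)
```

Note that this only works cleanly because the range’s maximum is at least double its minimum, so any fps below 60 can be multiplied back into the window.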

Does anyone here know why this is not the case? Is my assumption wrong or am I missing something?
