1080p. How much anti-aliasing is needed?

Hey logan and others. I was wondering if there was a way to work out the maximum amount of anti-aliasing needed for different monitor sizes. For example, you would need less AA on a 22" 1920x1080 monitor than on a 27" one at the same resolution. So do you guys think there is a way to accurately work out how much AA people should use, or is trial and error the only way?

You don't really need AA at all on monitors with high pixel density; if you do use anything, stick to supersampling or FXAA. Supersampling looks the best but eats performance, while FXAA is a noticeable improvement with minimal performance drop. You also don't need to play on high settings, that's just a privilege of having a good computer, and some people actually prefer even the lowest settings for obvious reasons.
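To put the size part of the question in numbers: pixel density is just the diagonal resolution divided by the diagonal size, PPI = sqrt(width^2 + height^2) / inches. A quick back-of-the-envelope sketch (plain PPI math, the figures are just for illustration):

```python
# Rough sketch: compare pixel density (PPI) of different monitor sizes
# at the same resolution, to see which one will show jaggies more.
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch for a given resolution and panel diagonal."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(1920, 1080, 22), 1))  # ~100.1 PPI on a 22" panel
print(round(ppi(1920, 1080, 27), 1))  # ~81.6 PPI on a 27" panel
```

Same 1920x1080 image, but the 27" spreads it over bigger pixels, so edges look more jagged and it benefits from more AA than the 22" does.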

Since most new games use enormous textures and have poor engine or level optimisation, they'll eat up VRAM pretty fast. The less VRAM you have, the lower you should go on AF and texture resolution. AA performance is mostly dictated by a card's pixel fillrate and memory bus width. If you can't see any difference between 8x and 16x AF in those games, use 8x. If you see noticeable lag when you enable AA, mess around with it until you find a good performance-quality compromise. A game can be playable at 24 fps, you know... it's just best if that game isn't Quake or CS, or you're screwed.

MSAA is old, and there's no tangible difference between 4x, 8x and 16x. QSAA just looks even blurrier.

FXAA is the best quality/performance compromise as long as it doesn't make the game's textures and geometry blurry, and it can be forced driver-side even in games that don't support it. If you can afford the performance drop, SSAA or CSAA are the best AA types for image quality.