Do you need AA while playing at 1080p?

Hello, forum!

Lately I've been playing some games at 1080p with 8x anti-aliasing, but I'm getting some fps drops from time to time.

I was thinking of lowering the AA. What are your thoughts? Is the difference noticeable? Do you really need high AA when playing at 1080p?

Share your thoughts!

You don't have to use it, but there's still some need for AA at 1080p. Try using 2x or even 4x AA instead. Maybe use FXAA, which is less demanding than MSAA and the like.

I don't need AA at 1440p.

This topic is very subjective. A lot of people can't stand the jaggies, but some aren't bothered by them at all. The least you can do is try playing your game without AA first before asking others for input; you're the one who's going to be playing on your PC, not us. Since I'm already replying to your question, though, I'll give my two cents on the subject: if you're getting fps dips when running AA (typical at 4x MSAA and above), try turning in-game AA off and injecting SMAA instead (2x samples is pretty good and efficient). Try the SMAA injector or SweetFX.

It does kind of depend on the graphics of the game. Playing Crysis 1 on high with AA turned off and on, I didn't notice any difference, but in Garry's Mod, turning AA on did make jagged lines look MUCH smoother.

I play at 1440p too.

I disagree with the "I don't need AA" point. That might be because I sit closer to my monitor, since my desk is rather shallow.

Increasing resolution and anti-aliasing are answers to different questions, and neither one is a substitute for the other.

As has been said, it's subjective. If the jagged edges don't bother you at whatever resolution, then you don't need AA, but they do not go away at any resolution.

It's not that I don't notice jaggies at this resolution. It's just that it isn't enough to bother me. I would definitely use AA at 1080p.

Can someone post a screenshot of AA vs. no AA? I never really understood what it did, and I never really saw jaggies, so I never use it; apparently I just don't know what I'm looking at. To this day I don't run it unless I can absolutely afford to, like if I'm getting above 90 fps I'll turn it on a little. I've never really seen a difference, but I could just be naive.

yup.

to each his own.

I sit close and it's more noticeable.

When rendering an image, each pixel ends up a single color. What anti-aliasing does is blend the colors along an object's edge with the background to make it look more seamless. Without it you get a staircase of colors around the edges.
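To make that concrete, here's a minimal, hypothetical sketch of the supersampling idea (SSAA) on a toy black-and-white diagonal edge; the function names and the 4x sample pattern are my own illustration, not any engine's actual implementation:

```python
# Toy scene: white (1.0) above the diagonal y = x, black (0.0) below it.
def render_sample(x, y):
    return 1.0 if y < x else 0.0

def pixel_no_aa(px, py):
    # One sample at the pixel center: fully white or fully black,
    # which is what produces the staircase.
    return render_sample(px + 0.5, py + 0.5)

def pixel_4x_ssaa(px, py):
    # Average four sub-samples inside the pixel; pixels the edge passes
    # through come out grey, i.e. blended with the background.
    offsets = [(0.25, 0.25), (0.75, 0.25), (0.25, 0.75), (0.75, 0.75)]
    return sum(render_sample(px + dx, py + dy) for dx, dy in offsets) / 4

row = [pixel_no_aa(x, 0) for x in range(4)]       # hard 0/1 staircase
row_aa = [pixel_4x_ssaa(x, 0) for x in range(4)]  # edge pixel is grey (0.25)
print(row, row_aa)
```

MSAA works on the same averaging principle but only multi-samples along geometry edges, which is why it's cheaper than full supersampling.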

I find 2xMSAA to be the right amount at 1080p.

Any more, and it's not noticeable enough to justify the performance drop.

At some point, resolutions will be so high that we won't need AA, but for 1080p, 2x MSAA should be enough, depending on the game. Honestly, though, I only really use SMAA or FXAA these days.

Frank answer: Yes, I think you need some form of AA at 1080p, because I can honestly and objectively see jaggies, at least when I sit close enough.

Nvidia GPUs get the best AA options and ways to force them. I first discovered CSAA in the Dolphin emulator running OpenGL; the performance hit from CSAA is minimal compared to regular MSAA, and it looks better, imo. Try CSAA, or a combination of CSAA and SMAA. I hate FXAA with a passion because it blurs the whole screen. AMD GPUs have some other AA types too; I think the interesting one was EQAA, introduced on the 6900 series.

Depends on pixel density. I typically run it at 2x to reduce the GPU overhead required to play the game.