At what PPI does AA become irrelevant?

Title says it all. What PPI makes anti-aliasing irrelevant? The standard right now for monitors is a 24" 1080p screen, which equates to ~92 PPI, so 24" at 4K is ~184 PPI. Most 4K monitors I've seen are 28", which is ~157 PPI. I'm guessing that 157 PPI is enough to eliminate the need for AA for the most part, and 184 PPI would be very, very smooth looking.
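For anyone who wants to double-check those numbers without the website, the math is just the diagonal pixel count divided by the diagonal size in inches. A quick Python sketch (assuming exact 16:9 resolutions and the advertised diagonal):

```python
# PPI = diagonal resolution in pixels / diagonal size in inches
from math import hypot

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch from a display's resolution and diagonal size."""
    return hypot(width_px, height_px) / diagonal_in

print(ppi(1920, 1080, 24))  # ~92  -> the "standard" 24" 1080p monitor
print(ppi(3840, 2160, 24))  # ~184 -> 24" at 4K
print(ppi(3840, 2160, 28))  # ~157 -> the common 28" 4K panels
```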

 

It always seemed to me that AA was just a substitute for not having a high enough pixel density to remove aliasing. What do you guys think? Can we ever reach the point of not needing AA?

 

For reference, this handy little website gives PPI readouts really easily.

http://isthisretina.com/

Yeah, I'd say that above 150 PPI you'd be fine without AA, as long as you're at the correct viewing distance for that size of monitor.

IMO, at 1440p on a 27" (about 108 PPI) I start to need AA less. Like, a lot less. Then again, I'm not as sensitive to that kind of thing.

At 4K I don't really think you need it at all. That is just me though.

I would want AA until you hit about 300 PPI.

But to see individual pixels at 300 PPI, wouldn't you have to be far too close to your monitor to actually use it?

That is the equivalent of 1080p at about 7" or 8K at 24". Somehow, I think that is a lot of overkill. Can you imagine a monitor with a pixel density as high as a tablet's? Those two things are meant to be used at very different distances from your eyes. I would honestly want to see the various PPIs side by side before I can say at what point it really stops making a difference, but I don't know of a realistic way to do that. I wish some tech site would run that as a test. Do you have any personal experience with high PPI, like 150 or above?
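Running the 300 PPI figure back through the same formula, as a rough sanity check on those equivalences (same assumptions as the sketch above: exact 16:9 panels, nominal diagonals):

```python
from math import hypot

def ppi(width_px, height_px, diagonal_in):
    return hypot(width_px, height_px) / diagonal_in

print(ppi(1920, 1080, 7.3))      # ~302 -> 1080p needs a ~7.3" diagonal to hit 300 PPI
print(ppi(7680, 4320, 24))       # ~367 -> 8K at 24" actually overshoots 300 PPI
print(hypot(7680, 4320) / 300)   # ~29.4 -> the 8K diagonal that lands right at 300 PPI
```

So the 1080p-at-7" comparison holds up, while 8K at 24" comes in a bit above 300 PPI; a ~29" 8K panel is what lands right at 300.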

 

I think there might be a difference for gaming, where it's about anti-aliasing and such, rather than for film, where it's about the ability to resolve detail.

This is a good site that puts things into context for screen size, distance from the screen, and when 1080p/4K/8K matters. It's film-oriented, not gaming-oriented.

http://carltonbale.com/does-4k-resolution-matter/

I think that there is definitely a difference between gaming and film as far as the PPI situation goes. Everything looks great when I watch a movie at 1080p, but when I start gaming at 1080p, I always find myself needing AA, regardless of the game. This is with my 24" 1080p screen at about 37", which is "retina" according to the site in the OP and maintains the full effect of 1080p according to the chart on that site, so by all means I shouldn't need AA, but I do. I very much do. I think this might be partly because film already has AA applied in most cases (they usually shoot at resolutions above 1080p and then scale down), and it doesn't really have as many harsh lines as games do.
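For what it's worth, the ~37" figure matches the usual 1-arcminute visual acuity rule of thumb, which is presumably what isthisretina.com is based on. A rough sketch of that calculation (assuming 20/20 vision and the 1-arcminute criterion, which is itself a simplification):

```python
# "Retina" distance: how far away you have to be before one pixel subtends
# less than 1 arcminute (roughly the limit of 20/20 vision).
from math import hypot, radians, tan

def retina_distance_in(ppi):
    """Viewing distance (inches) beyond which individual pixels can no
    longer be resolved, using the 1-arcminute rule of thumb."""
    return 1.0 / (ppi * tan(radians(1 / 60)))

ppi_24in_1080p = hypot(1920, 1080) / 24       # ~92 PPI
print(retina_distance_in(ppi_24in_1080p))     # ~37.5 inches
```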

 

So anyway, yeah, there is a difference, and what works for film won't always work for gaming. I would really like to see some testing on this with regard to AA in games.

It depends on the distance you are from the display... and how good your eyesight is. lol

Seriously though... 4K resolution will only be a thing for gaming on PC monitors. If you are ten feet away from a 55" TV screen, it really doesn't make a huge difference, because the human eye cannot discern the individual pixels of a 1080p screen, let alone a 1440p one, at that distance.
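Running the same 1-arcminute rule of thumb on that ten-foot / 55" example (again, just a rough sketch, not a claim about anyone's actual eyesight):

```python
# Angular size of one pixel, in arcminutes, at a given viewing distance.
from math import atan, degrees, hypot

def pixel_arcmin(width_px, height_px, diagonal_in, distance_in):
    pixel_pitch_in = diagonal_in / hypot(width_px, height_px)  # inches per pixel
    return degrees(atan(pixel_pitch_in / distance_in)) * 60

print(pixel_arcmin(1920, 1080, 55, 120))  # ~0.72 arcmin -> already below the ~1 arcmin limit
print(pixel_arcmin(2560, 1440, 55, 120))  # ~0.54 arcmin -> even further below it
```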

What is needed are higher and "smarter" refresh rates to make it all smooth and seamless.

Yeah, I think it works differently for games. Even when films are shot in 1080p you don't need anti-aliasing, because the cameras have anti-aliasing filters built into their sensors, so it happens at capture. For productions that shoot at a higher resolution and downscale, it usually doesn't matter, because the sensors and lenses resolve the detail and you don't get artifacts like moiré or aliasing.

With games you're not capturing light on a sensor, so the issue is different, and there might be more to it than just the PPI of the display and the viewing distance, i.e. issues related to how the images are created/rendered.

Now if only we knew of a tech group with access to monitors with PPIs ranging from the standard ~90 to the new standard ~150... If only we could find someone like that, and then get them to run some tests on how noticeable the aliasing is.

Presumably when you can no longer visually identify individual pixels... I like the idea of Nvidia's MFAA though: 4x MFAA for the performance cost of 2x MSAA. That is probably quite good at 2560x1440 (although I don't know for sure, since I don't have a monitor of that resolution), but that would have a slightly higher pixel density than my monitor. I imagine past that you start seeing diminishing returns set in pretty fast.